1. 7

    At that time, when you turned on your computer, you immediately had a programming language available. Even in the 90s, QBasic was installed on almost all PCs. An interpreter and editor in one, so it was very easy to enter the world of programming. Kids could learn it themselves with cheap books and magazines full of BASIC program listings. And I think the most important thing - kids were curious about computers. I can see that today, the role of BASIC has been taken by Minecraft. I wouldn’t underestimate it as a trigger for a new generation of engineers and developers. Add more physics and more logic to it and it will be an excellent playground, like BASIC was in the 80s.

    1. 5

      Now we have the Raspberry Pi, Arduino, Python, Scratch, and so many other ways kids can get started.

      1. 10

        Right, but at the beginning you have to spend a lot more time showing a kid how to set everything up properly. I admit that this in itself is fun, but in the 80s you just turned the computer on with one switch and the environment was literally READY :)

        1. 7

          I think the problem is that back then there was much less competition for kids’ attention. The biggest draw was TV. TV – that played certain shows on a particular schedule, with lots of re-runs. If there was nothing on, but you had a computer nearby, you could escape and unleash your creativity there.

          Today – there are phones/tablets/computers everywhere and mega-society-level connectivity. There’s no time during which they can’t find out what their friends are up to.

          Even for me – to immerse myself in a computer, exploring programming – it’s harder to do than it was ten years ago.

          1. 5

            I admit that this in itself is fun, but in the 80s you just turned the computer on with one switch and the environment was literally READY :)

            We must be using some fairly narrow definition of “the ‘80s”, because this is a seriously rose-tinted description of learning to program at the time. By the late 80’s, with the rise of the Mac and Windows, the only way to learn to program involved buying a commercial compiler.

            I had to beg for a copy of “Just Enough Pascal” in 1988, which came with a floppy containing a copy of Think’s Lightspeed Pascal compiler, and retailed for the equivalent of $155.

            Kids these days have it comparatively easy – all the tools are free.

            1. 1

              Windows still shipped with QBasic well into the 90s, and Macs shipped with HyperCard. It wasn’t quite one-click hacking, but it was still far more accessible than today.

            2. 4

              Just open the dev tools in your browser and you’ll have an already configured JavaScript development environment.

              I entirely agree with you on

              And I think the most important thing - kids were curious about computers.

              You don’t need to understand how a computer program is made in order to use it anymore, which is not necessarily a bad thing.

              1. 4

                That’s still not the same. kred is saying it was the first thing you saw, and you could use it immediately. It was also a simple language designed to be easy to learn. Whereas you have to go out of your way to get to a JS development environment, on top of learning a complex language and concepts. More complexity. More friction. Less uptake.

                The other issue that’s not addressed enough in these write-ups is that modern platforms have tons of games that treat people as consumers, with psychological techniques to keep them addicted. They also build boxes around their minds where they can feel like they’re creating stuff without learning much in the way of useful, reusable skills, versus the prior generation’s toys. Kids can get the consumer and creator high without doing real creation. So now they have to ignore all that and push through the high-friction stuff above to get to the basics of creating that existed for the old generation. Most won’t want to, because it’s not as fun as their apps and games.

                1. 1

                  There is no shortage of programmers now. We are not facing any issues with not enough kids learning programming.

                  1. 2

                    I didn’t say there was a shortage of programmers. I said most kids were learning computers in a way that trained them to be consumers vs creators. You’d have to compare what people do on consumer platforms versus things like Scratch to get an idea of what we’re missing out on.

            3. 4

              All of those require a lot more setup than older machines where you flipped a switch and got dropped into a dev environment.

              The Arduino is useless if you don’t have a project, a computer already configured for development, and electronics breadboarding to talk to it. The Raspberry Pi is a weird little circuit board that, until you dismantle your existing computer and hook everything up, can’t do anything - and when you do get it hooked up, you’re greeted with Linux. Python is large, and it’s hard to put images on the screen or make noises with it in a few lines of code.

              Scratch is maybe the closest, but it still has the “what programmers doing education think is simple” problem, rather than being “simple tools for programming in a barebones environment that learners can manage”.

              The field of programming education is broken in this way. It’s a systemic worldview problem.

              1. 1

                Those aren’t even close in terms of ease of use.

                My elementary school circa 1988 had a lab full of these Apple IIe systems, and my recollection (I was about 6 years old at the time, so I may be misremembering) is that by default they booted into a BASIC REPL.

                Raspberry Pis and Arduinos are fun, but they’re a lot more complex and difficult to work with.

              2. 3

                I don’t think kids are less curious today, but it’s important to notice that back then, making a really polished program that felt professional only needed a small amount of comparatively simple work - things like prompting for all your inputs explicitly rather than hard-coding them, and making sure your colored backgrounds were redrawn properly after editing.

                To make a polished GUI app today is prohibitive in terms of time expenditure and diversity of knowledge needed. The web is a little better, but not by much. So beginners are often left with a feeling that their work is inadequate and not worth sharing. The ones who decide to be okay with that and talk about what they’ve done anyway show remarkable courage - and they’re pretty rare.

                Also, of course, back then there was no choice of which of the many available on-ramps to start with. You learned the language that came with your computer, and if you got good enough maybe you learned assembly or asked your parents to save up and buy you a compiler. Today, as you say, things like Minecraft are among the options. As common starting points I’d also like to mention Node and PHP, both ecosystems which owe a lot of their popularity to their efforts to reduce the breadth of knowledge needed to build end-to-end systems.

                But in addition to being good starting points, those ecosystems have something else in common - there are lots of people who viscerally hate them and aren’t shy about saying so. A child just starting out is going to be highly intimidated by that, and feel that they have no way to navigate whether the technical considerations the adults are yelling about are really that important or not. In a past life, I taught middle-school, and it gave me an opportunity to watch young people being pushed away by cultural factors despite their determination to learn. It was really disheartening.

                Navigating the complicated choices of where to start learning is really challenging, no matter what age you are. But for children, it’s often impossible, or too frightening to try.

                I agree with what I took to be your main point, that if those of us who learned young care about helping the next generation to follow in our footsteps, we should meet them where they are and make sure to build playgrounds that they can enjoy with or without a technical understanding. But my real prediction is that the cultural factors are going to continue to be a blocker, and programming is unlikely to again be a thing that children have widespread mastery of in the way that it was in the 80s. It’s really very saddening.

              1. 3

                I’ve tried to use Darktable (and RawTherapee) a few times without success. Both have a tremendous number of features, but compared to Lightroom they lack a simple UI. I wish there were an option for just one panel with a reasonable set of settings available. I feel the amount of features there is too much for a casual photographer like me. BTW, I loved how the old Google Picasa worked - Darktable/RawTherapee could take some inspiration from that tool.

                1. 2

                  Oh, I can fully recommend giving it one afternoon with some video tutorials. After that, you should be comfortable with the basic functionality. If I remember correctly, you can customize the interface to show the panels that you like.

                  I switched to Lightroom a while ago, mostly because my most powerful machine was a Mac with Mac OS. I had trouble finding things in Lightroom for a while and thought that Darktable is organized more logically.

                  1. 1

                    I’ve used Darktable for many years now and I find the UI to be really great. It is precisely what you need without being too complicated. It is geared towards professional use. Spend a couple of hours with Darktable and you’ll be right at home. It is easy to configure a set of features and only use those.

                  1. 1

                    Sorry, but that’s the way an inkjet printer works. You don’t want your printhead clogged with dry ink, so it has to be cleaned sometimes. It’s like a car’s FAP filter :) They could add some kind of tray or replaceable reservoir, but that was probably economically non-viable (for the manufacturers) - your printhead will be dead first. Nevertheless, there is some progress in inkjet printers - Epson, Brother and HP sell printers with a built-in CISS now, so at least you can save more on ink than before. You don’t have to buy new non-eco plastic cartridges anymore. By the way, there is an aftermarket replacement for these sponges, and there is software (although not for all models) that can reset the counter.

                    1. 4

                      I use quite a number of SoPines and Pine64s running Armbian every day for work. Hardware support has been great (including the HDMI and XFCE desktop), so I’d recommend that combination.

                      I haven’t used their Pinebook yet, but I assume it’s equally good.

                      1. 3

                        Does the SoC have open source GPU drivers? I’ve seen on the Pine A64 page that it runs a mainline kernel, but I’m not sure if that includes the GPU driver.

                        EDIT:

                        I found it in their FAQ. It has a Mali 400 MP2 GPU. So I guess it’s not horrible, but not good either.

                        1. 3

                          For an open GPU you need Etnaviv at this point, so i.MX6 or similar.

                        2. 2

                          Good to know. I assume this is with the legacy 3.10 kernel, not mainline. Reading up on the situation, I don’t see a reason to believe Allwinner will continue to support the hardware or update the supported kernel. I don’t know how I feel about that. On the other hand, even the Pinebook barely costs anything compared to the various Chromebooks. The 2 GB RAM maximum and low resolution eliminate it as a daily driver for me, but at that price I might just get one for the hell of it.

                          1. 2

                            Yes, we’re using the 3.10 kernel.

                            1. 2

                              There is documentation for the Allwinner SoC (at least the H3) available to all, but the chapter describing configuration of the HDMI output, even though it is listed in the index, is missing. Documentation for the graphics chip is the same sad story.

                          1. 2

                            I am seriously impressed. Around a year ago I dropped Firefox for Chromium because sites like 500px were visibly slow. I just tried Quantum and I see a great improvement. Awesome job!

                            1. 2

                              I’m really amazed that even after 10 years, this device has an update path that allows it to be used with current services. I own an Amazon Fire 8.9’’ tablet and I’m very glad that it is still supported (though I went the CyanogenMod path).

                              1. 3

                                The actual failure was the decision to charge OEMs for the software. Yeah, it works a treat on the desktop where you own ~90% of the market. It’s a lot less attractive when your #1 competitor gives their software away for free. They were prevented by their success from being able to see the real shape of the market. All the sound and fury, all that money flushed down the toilet with the Nokia acquisition, it was all rearranging deckchairs. They were doomed from jump street.

                                1. 3

                                  In an alternative reality, we would be using a Nokia phone (probably an N14 by now) based on Maemo/MeeGo. I owned an N9 and loved it. If Intel and Nokia had gotten along properly, we could have had a real alternative to Android/iOS.

                                  1. 1

                                    Yeah, Nokia’s suicide was very frustrating. With the Pyra and Librem5 on the way, I’m hoping to get back to that world :)

                                    1. 1

                                      I’m curious – what could Nokia have done? They were minting money right up until they weren’t; it seems iPhone caught them out just as it did everyone else.

                                      1. 1

                                        Consider the possibility that Apple had gotten something right and it wasn’t just a fad. After the initial surprise, a lot of competitors spent a long while in denial about which features were and were not business essential.

                                        1. 1

                                          Well, yeah. But there was basically no way for Nokia (or RIM) as a company to see the nature of the existential threat that iPhone posed, given the limitations of their corporate cultures.

                                1. 1

                                  Divide and conquer - the more currencies on the market, the less value they represent and the more confused people are. This is very good for the ‘old economy’.

                                  1. 5

                                    I prefer solving these tests to answering boring questions about const-correctness or operator precedence, even if they may take several days to solve. Currently I’m finishing such an assignment (a tool loosely connected to their field) where the company I’m applying to expects me to deliver code + documentation + tests. Code on GitHub does not necessarily reflect the quality of your work, especially when you work for a company that does not show its code or take part in the open-source movement.

                                    1. 3

                                      I used boost::any and all I can say is that it works. This and std::variant are not the prettiest solutions, but because it is in the standard, it is better than no solution. I really like the simplicity of QVariant from Qt, but the syntax for adding custom types leads to the same problems as in boost::any. I hope that in the near future C++ will get real pattern matching syntax, and then all the explicit conversions to the proper type will be just a nightmare from the past.
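
                                      For what it’s worth, here is a minimal C++17 sketch (my own illustration, not code from any of the libraries mentioned) of the contrast: std::any needs an explicit any_cast to the stored type, while std::visit over a std::variant is the closest thing we currently have to pattern matching.

                                      ```cpp
                                      // Toy example: explicit cast out of std::any vs. visitation over std::variant.
                                      #include <any>
                                      #include <iostream>
                                      #include <string>
                                      #include <variant>

                                      int main() {
                                          // std::any: you must know and spell out the stored type to get it back.
                                          std::any a = std::string("hello");
                                          std::cout << std::any_cast<std::string>(a) << '\n';

                                          // std::variant + std::visit: each alternative is handled in one place,
                                          // with no explicit conversion at the call site.
                                          std::variant<int, std::string> v = 42;
                                          std::visit([](const auto& value) { std::cout << value << '\n'; }, v);
                                      }
                                      ```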

                                      1. 3

                                        When I was a student in 2000-2001, I was writing a text editor for web pages for BeOS, called ‘Herring’. There is a web archive mirror of my site from that time, but the screenshots were not backed up. I think I’m missing the source code, but I should have a binary executable ready to run on R5 :) Oh, fond memories.

                                        1. 5

                                          If you happen to have the binaries anywhere, you could always recompile everything for Haiku. Might be a fun weekend project.

                                          1. 2

                                            Thanks for the hint, maybe on some lazy day I will give it a try.

                                        1. 3

                                          The moral of the story? You can’t hide on the internet anymore. Your sentence structure and word use is MORE unique than your own fingerprint. If an organization, like the NSA, wants to find you they will.

                                          I can’t change my fingerprint, at least not without some painful self-mutilation. I can change my writing style whenever I’m logged in to my troll accounts. And as the linked GitHub project hints, it oughtn’t be too hard to use a program to normalise one’s writing style.

                                            1. 1

                                              Maybe some kind of translator from English to Globish could help. Translating more complex language structures into simpler ones, so the fingerprints of your own language would be lost.

                                              1. 1

                                                Google Translate, and back again. If this post is correct, it came down to the usage of 50 words. So, really, you need a thesaurus integrated into your writing setup to vary adjectives and adverbs and such, too.
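
                                                A toy sketch of that idea in C++ (the substitution table below is made up purely for illustration): flatten the writer’s most distinctive adjectives and adverbs into blander substitutes before posting.

                                                ```cpp
                                                // Naive style "normaliser": swap distinctive words for bland ones.
                                                // A real tool would pull substitutes from a thesaurus instead of
                                                // this tiny hard-coded table.
                                                #include <iostream>
                                                #include <map>
                                                #include <sstream>
                                                #include <string>

                                                int main() {
                                                    const std::map<std::string, std::string> blander = {
                                                        {"tremendous", "big"}, {"marvellous", "nice"}, {"seriously", "very"}};

                                                    std::string text = "a tremendous and marvellous result", word, out;
                                                    std::istringstream in(text);
                                                    while (in >> word) {
                                                        auto it = blander.find(word);
                                                        out += (it != blander.end() ? it->second : word) + " ";
                                                    }
                                                    std::cout << out << '\n';  // prints: a big and nice result
                                                }
                                                ```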

                                            1. 2

                                              I have the impression that when you run this ‘new fast engine in ******’ on older hardware, it will be the same sluggish program it was before the new approach was implemented. It is just an illusion created by faster and more efficient processors :)

                                              1. 4

                                                The ability to make proper use of those faster and more efficient processors is already quite the improvement nonetheless.

                                                1. 1

                                                  Yes, it could be interesting to compare builds of Firefox or Chrome compiled by, e.g., a 10-year-old GCC or Clang versus a recent one.

                                                2. 2

                                                  Where’d you get that impression? My understanding is that it’s universally faster.

                                                  1. 1

                                                    My mother’s old two-core Athlon64-based computer gets updates regularly (it has Ubuntu installed) - just a 10-year-old machine. She uses Firefox and I don’t see any improvement on it. I think you may notice some speedup on recent hardware, as the new engine may be better optimized for new instructions or may use more threads, which are not available on the older machine.

                                                    1. 1

                                                      Your mother is using a nightly Firefox build? Quantum things are bound to land in stable in Firefox 57, which has a release date in mid-November.

                                                      1. 1

                                                        Of course she is not using nightly builds. But as far as I remember, Firefox improves with every stable release, so I assumed that I (or my mother) should see these improvements on the old machine.

                                                        1. 2

                                                          These improvements aren’t in stable yet, only in nightly, so if you’re not using nightly, then you’re not testing these improvements.

                                                          1. 2

                                                            That makes sense, thanks!

                                                1. 4

                                                  The ‘Bare bone’ part is a very nice introduction to how to start writing an x86-64 OS in any language. Another thing I noticed is the x86 assembly. I do not have much experience with it, but I noticed that even though it is a CISC processor (in the end), it is still used in a somewhat RISC-like manner: move some constant into a register and then move it to another register - look at e.g. the enable_paging procedure. I always had the impression that in x86 this could be done as a single assembler instruction.

                                                  1. 2

                                                    I think in a lot of cases this is necessitated by the instruction encoding. x86_64 uses 3 or 4 bits to represent a register, which works well for the 16 general-purpose registers, but to access other registers you need separate instructions.
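
                                                    For illustration, a minimal sketch of that two-step pattern written as GCC/Clang inline assembly (my own example, not the tutorial’s code, and only meaningful in ring-0 kernel code): CR0 has no immediate form, so the constant has to take a detour through a general-purpose register.

                                                    ```cpp
                                                    // Sketch only: control registers such as CR0 can't be loaded with an
                                                    // immediate, hence the RISC-looking move-through-a-register pattern.
                                                    inline void enable_paging_bit() {
                                                        unsigned long cr0;
                                                        asm volatile("mov %%cr0, %0" : "=r"(cr0));   // CR0 -> general-purpose register
                                                        cr0 |= 1UL << 31;                            // set the PG (paging) bit
                                                        asm volatile("mov %0, %%cr0" : : "r"(cr0));  // general-purpose register -> CR0
                                                    }
                                                    ```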

                                                  1. 2

                                                    I like how Google/Android simplified the stack; the regular (desktop) Linux solution is some kind of schizophrenic dream :) Lots of dependencies - especially when you have to switch between Intel<->AMD<->NVIDIA drivers, you may spend half a day cleaning your system of unwanted packages.

                                                    1. 1

                                                      And ChromeOS uses Freon

                                                      1. 1

                                                        Yeah - Google, Microsoft and Apple are big enough to convince other players to deliver them drivers for the APIs they prepare.

                                                    1. 6

                                                      It reminds me of the old times of Internet Explorer and magic toolbars :) Especially since Atom is based on a web browser. Are we going back in time?

                                                      1. 4

                                                        Now imagine a world where Bitcoin is the world’s primary monetary unit (e.g. like the dollar is today) and one day it goes into “maintenance mode” for a day, two, or a month. Will the world halt for that time?

                                                        1. 4

                                                          With Bitcoin’s far more serious deflationary problem, this would be the least of our worries.

                                                        1. 1

                                                          Coincidentally, a recent Computerphile episode covers this topic to some extent, testing on ancient Atari hardware.

                                                          1. 1

                                                            I have a long-term project to build a car media player - based on a Raspberry Pi and a cheap LCD screen from eBay that fits into the original head-up display from a Chrysler (which was broken). I plan to use BT audio to send sound from the RPi, placed in the middle of the car, to the radio in the front. What could go wrong here?

                                                            1. 1

                                                              I’m looking forward to trying this on my OrangePi One. Armbian + a WiFi dongle was not very stable on this board, and I have little hope that NetBSD will be better in this area. All I need is something that can run CUPS over Wi-Fi for a simple print server. And I don’t want to spend $$$ on this if I already have all the components to build it myself.