1. 7

    Cory always scares me.

    1. 21

      This was from 2012. Arguably, we’re already there. Tons of popular computers run signed bootloaders and won’t run arbitrary code. Popular OS vendors already pluck apps from their walled garden on the whims of freedom-optional sovereignties.

      The civil war came and went and barely anyone took up arms. :(

      1. 5

        It’s not like there won’t always be some subset of developer- and hacker-friendly computers available to us. Sure, iPhones are locked down but there are plenty of cheap Android phones which can be rooted, flashed with new firmware, etc. Same for laptops, there are still plenty to choose from where the TPM can be disabled or controlled.

        Further, open ARM dev boards are getting both very powerful and very cheap. Ironically, it might even be appropriate to thank China and its dirt-cheap manufacturing industry for this freedom since without it, relatively small runs of these tiny complicated computers wouldn’t even be possible.

        1. 9

          This is actually the danger. There will always be a need for machines for developers to use, but the risk is that these machines and the machines for everyone else (who the market seems to think don’t “need” actual control over their computers) will increasingly diverge. “Developer” machines will become more expensive, rarer, harder to find, and not something people who aren’t professional developers (e.g. kids) own.

          We’re already seeing this happen to some extent. There are a large number of people who previously owned PCs but who now own only locked down smartphones and tablets (moreover, even if these devices aren’t locked down, they’re fundamentally oriented towards consumption, as I touched on here).

          Losing the GPC war doesn’t mean non-locked-down machines disappear entirely; it means the share of people owning them declines to a tiny fraction, and with that comes social irrelevance. The challenge is winning the GPC war for the general public, not just for developers. Apathy makes it feel like we’ve already lost.

          1. 0

            Arguably iPhones are dev friendly in a limited way. If you’re willing to use Xcode, you can develop for your iPhone all you want at no charge.

            1. 7

              Develop for, yes, within the bounds of what Apple deems permissible. But you can’t replace iOS and port Linux or Android to it because the hardware is very locked down. (Yes, you might be able to jailbreak the phone through some bug, until Apple patches it, anyway.)

              Mind you, I’m not bemoaning the fact or chastising Apple or anything. They can do what they want. My original point was just that for every locked-down device that’s really a general-purpose computer inside, there are open alternatives and likely will be as long as there is a market for them and a way to cheaply manufacture them.

              1. 4

                Absolutely! Even more impressive is that with Android, Google has made such a (mostly) open architecture into a mass market success.

                However it’s interesting to note that on that very architecture, if you buy an average Android phone, it’s locked down with vendorware such that in order to install what you want you’ll likely have to wipe the entire ecosystem off the phone and substitute an OSS distribution.

                I get that the point here is that you CAN, but again, most users don’t want the wild wild west. Because, fundamentally, they don’t care. They want devices (and computers) that work.

                1. 6

                  Google has made such a (mostly) open architecture into a mass market success.

                  Uh, I used to say that until I looked at the history and the present. I think it’s more accurate to say they made a proprietary platform on an open core a huge success by tying it into their existing, huge market. They’ve been making it more proprietary over time, too. So, maybe that’s giving them too much credit. I’ll still credit their strategy with doing more good for open-source and user-controlled phones than their major competitors’. I think it’s just a side effect of the GPL and them being too cheap to rewrite the core at this point, though.

                2. 2

                  I like to think that companies providing OSes are a bit like states. They have to decide where to draw the line between liberty and safety, and that’s not an easy task.

                3. 3

                  This is not completely true. There are some features you can’t use without an Apple developer account, which costs $100/yr. One of those features is NetworkExtension.

                  1. 2

                    friendly in a limited way.

                    OK, so you can take issue with “all you want” but I clearly state at the outset that free development options are limited.

            2. 6

              Over half a million people, about 2 out of every 100 Americans, died in the Civil War. There was little that innocent folks in the general public could do to prevent it or minimize the losses. Personally, I find his “civil war” to be less scary. The public can stamp these problems out if they merely care.

              That they consistently are apathetic is what scares me.

              1. 5

                Agreed 100%.

                I have no idea what to do. The best solution I think is education. I’m a software engineer. Not the best one ever, but I try my best. I try to be a good computing citizen, using free software whenever possible. Only once did I meet a coworker who shared my values about free software and not putting so much trust in our computing devices - the other 99% of the time, my fellow devs think I’m crazy for giving a damn.

                Let alone whether people without technical backgrounds give a damn about this stuff. If citizens cared and demanded freedom in their software, society would be positioned much better to handle “software eating the world”.

                1. 6

                  The freedoms guaranteed by free software were always deeply abstruse and inaccessible for laypeople.

                  Your GNOME desktop can be 100% GPL and it will still be nearly impossible for you to even try to change anything about it; even locating the source code for any given feature is hard.

                  That’s not to say free software isn’t important or beneficial—it’s a crucial and historical movement. But it’s sad that it takes so much expertise to alter and recompile a typical program.

                  GNU started with an ambition to have a user desktop system that’s extensible and hackable via Lisp or Scheme. That didn’t really happen, outside of Emacs.

                  1. 6

                    Your GNOME desktop can be 100% GPL and it will still be nearly impossible for you to even try to change anything about it; even locating the source code for any given feature is hard.

                    I tried to see how true that is with a random feature. I picked the brightness setting in the system status area. Finding the source for this was not so hard; it took me a few minutes (it turns out to be JavaScript). Of course it would have been better if there were something similar to browser developer tools somewhere.

                    Modifying it would probably be harder since I can’t find a file called brightness.js on my machine. I suppose they pack the JavaScript code somehow…
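
                    My best guess is that the shell’s JS is compiled into a GResource bundle inside libgnome-shell, which would explain the missing file. A rough sketch of how one might peek inside, assuming GLib’s gresource CLI is installed and guessing at the library path (it varies by distro):

                      # Sketch: list GNOME Shell's bundled JS via GLib's `gresource` tool.
                      import subprocess

                      LIB = "/usr/lib/gnome-shell/libgnome-shell.so"  # assumption; varies by distro

                      listing = subprocess.run(
                          ["gresource", "list", LIB],
                          capture_output=True, text=True, check=True,
                      ).stdout

                      for path in listing.splitlines():
                          if "brightness" in path.lower():
                              print(path)  # the resource path of the brightness code, if present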

                    About 10 years ago (before it switched to ELF) I used Minix3 as my main OS for about a year. It was very hackable. We did something called “tracking current” (which apparently is still possible): the source code for the whole OS was on the disk and it was easy to modify and recompile everything. I wish more systems worked like this.

                    1. 6

                      Remember when the One Laptop Per Child device was going to have a “view source” button on every activity?

                      1. 1

                        Oh yes, that would have been so nice…

              2. 3

                Cory always brings to the table so much more work that needs to be done.

              1. 9

                I recently told my boss I could rewrite 80% of our project from scratch in a few months, with two caveats.

                1. I choose which 80% of the features
                2. I’m not making any promises about the other 20%
                1. 4

                  Is that really helpful at all?

                  1. 1

                    Nope. He and I both knew that the ugly 20% that I would exclude is actually indispensable from the customers’ perspective. We were talking about our ongoing efforts to modernize the product, and I brought it up to illustrate the same point that this blogger does.

                1. 0

                  The new computer had a one-plus-one addressing scheme, in which each machine instruction, in addition to the operation code and the address of the needed operand, had a second address that indicated where, on the revolving drum, the next instruction was located.

                  In modern parlance, every single instruction was followed by a GO TO! Put that in Pascal’s pipe and smoke it.

                  – The Story of Mel

                  Pretty crazy! I’m sure glad we don’t do anything like that any more.

                  More specifically, the attacker first finds usable gadgets in the victim binary. She then uses a buffer overflow vulnerability to write a sequence of addresses of gadgets into the victim program stack. Each gadget performs some computation before executing a return instruction. The return instruction takes the return address from the stack, and because the attacker controls this address, the return instruction effectively jumps into the next gadget in the chain.

                  – Spectre Attacks: Exploiting Speculative Execution

                  Everything old is new again.
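
                  For flavor, here’s a toy model of that gadget chain in Python. It’s purely illustrative (real ROP chains machine-code fragments, not functions, and the addresses here are made up), but it shows how an attacker-controlled stack of addresses becomes a program:

                    # Toy model: the "stack" holds gadget addresses chosen by the attacker,
                    # and each "return" pops the next address and jumps there.

                    def gadget_add_one(state):
                        state["acc"] += 1          # small computation, then "ret"

                    def gadget_double(state):
                        state["acc"] *= 2          # small computation, then "ret"

                    memory = {0x1000: gadget_add_one, 0x2000: gadget_double}  # the victim binary
                    stack = [0x1000, 0x2000, 0x1000]  # written via the buffer overflow

                    state = {"acc": 0}
                    while stack:
                        ret_addr = stack.pop(0)    # the return takes its target from the stack...
                        memory[ret_addr](state)    # ...so control flows into the next gadget

                    print(state["acc"])  # 3: a computation the victim program never contained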

                  1. 33

                    In other news, resistors and capacitors are keeping kids from electronics, and getting their hands icky is keeping kids from cooking.

                    It’s been proven that saying programming is easy is a bad idea; it isn’t. Also, all instructions given to a computer in any way get translated at some point or other into machine code (best represented by lists of text instructions if you want a human to read it). Pretending that’s not true doesn’t seem useful.

                    It would be better to frame this conversation in terms of the group of people who want to make computing in some way accessible to other people. What are they trying to achieve, and what do they want out of this ‘coding’ experience? Let’s just stop pretending that it has anything to do with programming, which is always going to be processing lists of instructions.

                    1. 24

                      The whole point of the article is that their playtesters were intimidated from starting the game because the text UI looked too scary, but if the game started with icons and gradually switched over to text kids had no trouble with it.

                      1. 14

                        That’s funny, because for me it was exactly the other way around. Ok, I was a bit older than five years old, twice that even. Still a young kid. Back then I discovered mIRC and learned how to write mIRC scripts. To this day IRC has been my favorite protocol: next to being able to easily talk to each other, the event-based potential is just very appealing.

                        (Ok, anecdote time. If there were an ability to hide the rest of my comment underneath a “read more” link, this is where you could click. But since there isn’t any such functionality, the best I can do is offer my apologies to anyone who is displeased by the length of my comment. Clicking on the [-] next to my nick will hide the entire thing.)

                        Maybe a year or so later if memory serves me right — I can’t have been much older than 12 because I can still remember both the old house where I used to sneak into the other room to steal, err, borrow my stepdad’s external ISDN modem and extremely long phone cable; and the monthly fights we had about the phone bills — an online friend told me that you didn’t have to use an IRC client (intended to be used interactively) for automation: someone named Robey Pointer had designed IRC bot software called ‘eggdrop’.

                        (My friend also kept going on about how envious he was of Robey’s last name which initially made me reconsider our friendship, it sounded so weird that it made me unsettled – many months later a different friend unsettled me again when he told me about pointers, causing me to realize that my first friend actually might not have been that weird at all)

                        This ‘eggdrop’ thing made me very curious and I couldn’t wait to give it a go. So I got onto my 486SX with Windows 95, downloaded and extracted the zip file, executed eggdrop.exe and.. wait.. what was this? All I saw was a command prompt that did appear to show some text, but it disappeared before I could make sense of it.

                        So I opened up a command prompt myself, typed ‘eggdrop.exe’, and read something cygwin something, and about there not being a user file, et cetera. Alright, my curiosity was now really piqued.

                        Eventually I got it running and was very pleased with myself – anyone who’s ever configured an eggdrop knows that it’s not exactly trivial, and even more so when most of it is way outside the things you know about.

                        Then I discovered that it actually wasn’t for Windows at all. Thanks to the integrated Cygwin it worked, but as it turned out most people would use something entirely different from Windows. Called Linux.

                        And that’s how I discovered and got fond of Linux. First RedHat, then SuSE, then Slackware… and to this day, 20 years later, I still enjoy using the terminal, and IRC, and all kinds of other exotic things that people call anything from unappealing, to downright scary, with ‘complicated’ somewhere in the middle. I don’t think it’s any of that. All one needs is a curious mind, the ability to disregard what other people say about it, and fun ideas. I didn’t care about how esthetically pleasing any of it was, at all. I cared because of how thought provoking it was; a new dimension, and mine to explore.

                        To me this is where the heart of the matter is, and I think it’s where most people are mistaken - in my view, computers in and of themselves aren’t fun. Computers shaping people aren’t fun either. People shaping computers is where the fun’s at.

                        What makes working with computers so intriguing, especially from an engineering perspective, is that you’re immersing yourself into a world of codified thoughts made entirely by human brains. The choices, and even psychology behind things. It’s like a labyrinth: intelligently designed by humans, so there simply has to be logic to it.

                        Discovering the logic. That’s the appeal.

                        1. 7

                          I was a bit older than five years old, twice that even. Still a young kid.

                          That’s a huge difference in terms of reading skill. Maybe bigger than the additional reading skill you acquire between ten and twenty years old.

                          I think this article has more to do with early childhood development than it does with teaching programming in general. The more I think about it, the more I’m impressed that children that young were able to do any kind of programming.

                          And the more I think about it, the more I hate the title.

                          1. 2

                            Agreed. Plus, the more I think about it, the less I trust the research: you can never go wrong with 5-year-old subjects. Totally unreliable.

                            But, gah, I’ll admit. I wrote most of my comment before realising how old they were. So that explains that.

                          2. 3

                            This was very unexpected and gratifying; thanks. (My last name got a lot of laughs in programming classes, too.) :)

                            On topic, I think “the thrill of solving puzzles” is a big part of what got me into coding as a kid, too. I still remember Silas Warner’s “Robot War” and trying to figure out why one bot always won. Part of the disconnect here may be that 5 is too young for most kids to find fun in written language, so the puzzles have to be scaled down. But my instinct agrees with yours: dropping a puzzle in front of a kid and saying “this should work, if you can figure it out” is usually a great way to motivate them to learn something.

                          3. 4

                            It’s more likely a lack of autocomplete and red squigglies. Also having a cheat sheet on hand would probably help when getting started.

                            1. 3

                              “Danny! Don’t eat the cheat sheet!”

                              5 year olds ¯\_(ツ)_/¯

                          4. 11

                            I’m sympathetic to text, and to programs as lists of instructions, and I totally agree that saying programming is easy is counter-productive. But I feel I’m missing something about your argument given the obvious holes, so can you elaborate?

                            1. I don’t understand this distinction you’re making between ‘programming’ and ‘coding’. To the extent that programming isn’t accessible, I think we should be changing programming to be more accessible. Creating new distinctions seems unnecessary, and also inherently a political rather than technical act: even if you don’t intend to, you’re liable to end up creating us-vs-them divisions between ‘programmers’ and ‘coders’.

                            2. Programs always eventually get translated to zeros and ones. Surely that doesn’t mean we should be programming in zeros and ones? You’re obviously aware of expression-oriented languages like Hy. Similarly, translating icons to addresses doesn’t seem much different to me than translating words to addresses. What am I missing in your argument?

                            One of my students is contributing to this project which assigns icons to new procedures by default. It seems useful.

                          1. 19

                            I almost entirely disagree with this. The only thing I’ll grant is that you can’t always tell the exact same story in a video game as you can in another medium (Doh!).

                            For instance, Papers, Please is a game about checking paperwork as an immigration inspector. It would make a fairly boring book or movie, but as a game, it tells an interesting story that’s entirely influenced by every one of the player’s decisions.

                            There are plenty of other examples of this type of game, and it’s really a disservice to the medium to choose a small subset of AAA action games that are attempts to copy Hollywood films, then blame video games for being poor imitations of Hollywood films.

                            1. 4

                              Papers, Please gives you an emergent story, which I love and which games can do better than any medium. I think the article could have been better titled, “Video Games Are Better Without Scripted Stories.”

                              Even then, there are a few scripted game stories that I love. I do feel like they’re the exception.

                              1. 3

                                I think this critique misses the crux of the article, which criticizes specific movements in artistic video game storytelling by examining the relationship of storytelling to the rules and problem solving we consider gameplay. Specifically, it explores the investigatory, facade-like “storytelling through environment” technique that dominates game storytelling (Papers, Please still falls under this kind of storytelling), which has its own shortcoming: it creates a foundational contradiction between the mechanical and the fictional.

                                AAA games especially suffer from this contradiction because of their publishers’ interest in cinematic industry profits and disinterest in conceptual artistic problems related to storytelling. And this is made especially interesting by the massive influence and cultural importance of AAA games. It’s a useful examination.

                              1. 8

                                Unmentioned: the Chicago Tribune website has been steadily declining in quality. I won’t argue the quality of the writing, but there are constantly more ads (pulling up the top story, my ad blocker caught a dozen items and there were still four on the page). Around the time of this change, they introduced a paywall that limits you to X articles per month and has an undismissable modal telling you to turn off your ad blocker (few blocker filters catch this modal).

                                Facebook watches everything. If they notice that the rate of users clicking through to ChicagoTribune.com and bouncing back to Facebook.com within a couple of seconds has leapt up, they’re going to consider it a low-quality site and serve it less often. Facebook probably never made a decision to reduce the frequency of links, or of news links, or of links to the Trib - this is probably just a general quality filter doing its job.

                                1. 2

                                  But what about their parent company’s plan to focus “on leveraging artificial intelligence and machine learning to improve the user experience and better monetize our world-class content in order to deliver personalized content to our 60 million monthly users and drive value for all of our stakeholders. Our rebranding to tronc represents the manner in which we will pool our technology and content resources to execute on our strategy.”

                                  Surely that would have improved their site’s quality by now?

                                1. 4

                                  I was surprised to see so many are leaving Lisp for Python… but then I remembered one popular company did exactly that, and everybody spent the next decade rehashing the decision.

                                  1. 12

                                    I love Lisp, truly, but for modern workloads, the ecosystem is there for Python and it’s not there for Lisp (which Lisp? Common Lisp? Scheme? Which implementation of CL?).

                                    If I want to connect to some Amazon API, there’s already a library in Python that’s well-supported, well-documented, and has tens of thousands of users. If I want to do that in Lisp, there may or may not be a library that may or may not work and may or may not be kept up to date. Python has networking and encryption in the standard library, etc, etc.
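
                                    To make that concrete, here’s roughly what talking to an AWS API looks like from Python with boto3 (the bucket name is made up, and credentials are assumed to be configured in the environment):

                                      # Minimal sketch: listing objects in an S3 bucket with boto3.
                                      import boto3

                                      s3 = boto3.client("s3")
                                      response = s3.list_objects_v2(Bucket="my-bucket")  # hypothetical bucket
                                      for obj in response.get("Contents", []):
                                          print(obj["Key"], obj["Size"])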

                                    I remember reading an article by…Paul Graham?…about writing early web apps in Common Lisp (I think it was the original Yahoo! Market, but don’t quote me on that). All of the things he talked about were revolutionary, and he was right that nobody could do what they were doing at the time…but that was because nobody had written it in any other language either. When you’re starting from absolute scratch, Lisp will win, but very, very few projects are started absolutely from scratch anymore.

                                    1. 3

                                      I remember reading an article by…Paul Graham?…about writing early web apps in Common Lisp (I think it was the original Yahoo! Market, but don’t quote me on that)

                                      You remember correctly. http://paulgraham.com/avg.html

                                    2. 3

                                      Most things in technology are there because of legacy and overriding commercial imperatives.

                                      Alas, very seldom in this industry does the best technology win.

                                    1. 6

                                      Is this article a troll?

                                      1. 4

                                        It did a nice job of dancing on that line so I couldn’t decide. Until I saw those numbers. $15 an hour? Definitely a troll.

                                      1. 3

                                        This person’s lab practices terrify me. I’m surprised he hasn’t described any hospital trips.

                                        1. 2

                                          He sounds very smart and also very stupid.

                                          1. 2

                                            The lack of hospital visits sounds like it’s attributable to, besides a bit of luck, some pretty serious safety equipment in the post-high-school examples: full-body hazmat suit, respirator mask with faceplate, etc.

                                          1. 5

                                            It would be nice if some of this were true. But we aren’t living in an era of fragmentation; we’re living in an era of consolidation. The rate of new business formation collapsed in the recession and hasn’t recovered. The economy is increasingly dominated by big, old firms.

                                            https://www.washingtonpost.com/news/on-small-business/wp/2015/02/12/the-decline-of-american-entrepreneurship-in-five-charts/

                                            http://fivethirtyeight.com/features/atts-merger-could-be-a-bad-sign-for-the-economy/

                                            1. 1

                                              The best part is Carmack’s comment on the article. He isn’t so anti-STL any more, even if he still doesn’t love it.

                                              http://kotaku.com/thanks-a-few-comments-in-some-ways-i-still-think-the-454293019

                                              1. 6

                                                I personally prefer µBlock, since it’s compatible with AdBlock-style filters while providing the option to only block cross-site stuff (as opposed to cosmetic blocking). It’s more of a privacy-centric approach, as opposed to the “all ads are bad” approach taken by the AdBlock devs that essentially breaks online monetization.

                                                1. 2

                                                  EFF’s Privacy Badger is another one that focuses on cross-site content and trackers rather than ads per se. It does require a bit of fiddling though, because it sometimes breaks pages that require cross-site loading for functionality, e.g. an embedded google widget of some kind (you can override that on a site-by-site basis).

                                                  1. 1

                                                    How does Privacy Badger compare to telling Chrome to block third-party cookies?

                                                    1. 2

                                                      I use all 3: uBlock Origin, Privacy Badger, and a cookie whitelist.

                                                      The cookie whitelist has had a huge impact on my habits. I don’t have hard data, but my Amazon spending dropped dramatically once cookies were blown away!

                                                      1. 2

                                                        It blocks more than just the cookies; it also blocks (some) third-party content from loading, to make it harder to do even server-side tracking (many of these trackers don’t strictly depend on cookies, using other techniques like browser fingerprinting and IP logging). It uses some heuristics about what’s likely to be doing cross-site tracking.

                                                        For example, when I loaded cnn.com just now, it blocked things from 17 domains from loading: aax.amazon-adsystem.com, www.budgetedbauer.com, staticxx.facebook.com, www.facebook.com, cdn.gigya.com, partner.googleadservices.com, secure-us.imrworldwide.com, vrp.outbrain.com, vrt.outbrain.com, widgets.outbrain.com, a.postrelease.com, pixel.quantserve.com, ads.rubiconproject.com, consent.truste.com, www.ugdturner.com, ad.doubleclick.net, and cdx.krxd.net. It allowed a further 3 domains to load widgets but without allowing cookies: static.chartbeat.com, cdn.optimizely.com, cdn3.optimizely.com.
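
                                                        As I understand it, the core heuristic is roughly “the same third party observed across enough distinct first-party sites gets treated as a tracker.” A toy sketch of that idea (the threshold of 3 is my recollection of Privacy Badger’s rule of thumb, so treat it as an assumption):

                                                          # Toy tracker heuristic: a third-party domain seen across
                                                          # enough distinct first-party sites is classified as a tracker.
                                                          from collections import defaultdict

                                                          THRESHOLD = 3  # assumed default; check Privacy Badger's docs
                                                          sightings = defaultdict(set)

                                                          def observe(first_party, third_party):
                                                              sightings[third_party].add(first_party)

                                                          def is_tracker(third_party):
                                                              return len(sightings[third_party]) >= THRESHOLD

                                                          observe("cnn.com", "pixel.quantserve.com")
                                                          observe("example.org", "pixel.quantserve.com")
                                                          observe("example.net", "pixel.quantserve.com")
                                                          print(is_tracker("pixel.quantserve.com"))  # True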

                                                  1. 5

                                                    SQL at #24, just ahead of Haskell? Oh-kay.

                                                    1. 2

                                                      As a DBA, I can assure you that very few developers actually engage with SQL, preferring to hit the DB through an ORM or other SQL wrapper instead. On the one hand, it’s frustrating, because there are still plenty of those “looping over ORM calls generates 10k queries where one would do” situations. On the other, it’s often fairly easy to tune something if they’re willing to embed raw SQL in their code (usually raw SQL that I write, mind you).
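
                                                        For anyone who hasn’t been bitten by it, here’s the shape of that problem in a self-contained sketch, with plain sqlite3 standing in for whatever the ORM generates:

                                                          # The classic "N+1 queries" pattern, sqlite3 standing in for an ORM.
                                                          import sqlite3

                                                          con = sqlite3.connect(":memory:")
                                                          con.executescript("""
                                                              CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
                                                              CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER);
                                                              INSERT INTO customer VALUES (1, 'Ada'), (2, 'Grace');
                                                              INSERT INTO orders VALUES (10, 1), (11, 2), (12, 1);
                                                          """)

                                                          # Looping, ORM-style: 1 query for the orders, then 1 more per order.
                                                          for (cust_id,) in con.execute("SELECT customer_id FROM orders"):
                                                              (name,) = con.execute(
                                                                  "SELECT name FROM customer WHERE id = ?", (cust_id,)
                                                              ).fetchone()
                                                              print(name)

                                                          # What the DBA would rather see: one query, joined in the database.
                                                          for (name,) in con.execute(
                                                              "SELECT c.name FROM orders o JOIN customer c ON c.id = o.customer_id"
                                                          ):
                                                              print(name)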

                                                    1. 2

                                                          My ultimate goal when working on a difficult project is that when problems arise, as they always do, it should require almost no effort to pinpoint and fix them.

                                                      “I don’t maintain legacy software that was architected by anyone less wise than me,” he bragged.

                                                      1. 5

                                                        I don’t think it should be that hard for him to get his coding chops back. The harder problem might be re-learning how to focus on a job in the first place. Six years of doing nothing is a long time to form habits.

                                                        1. 9

                                                          Getting stuck too easily in a local maximum is bad. Searching indefinitely for the global maximum is bad. You just have to be willing to stick most of the time, and hop around other times. Now I just need a name for it… pretend tempering?
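
                                                              (What’s being described is more or less simulated annealing. A toy sketch, maximizing a deliberately bumpy function: keep improvements, occasionally accept a worse hop, and accept those less often as the temperature cools.)

                                                                # Toy simulated annealing: mostly stick, sometimes hop, cool over time.
                                                                import math, random

                                                                def f(x):  # bumpy objective with several local maxima
                                                                    return math.sin(5 * x) + math.sin(x)

                                                                x = random.uniform(0, 10)
                                                                temp = 1.0
                                                                while temp > 1e-3:
                                                                    candidate = x + random.gauss(0, 0.5)  # the hop
                                                                    delta = f(candidate) - f(x)
                                                                    # Keep improvements; accept a worse point with probability
                                                                    # exp(delta / temp), which shrinks as things cool.
                                                                    if delta > 0 or random.random() < math.exp(delta / temp):
                                                                        x = candidate
                                                                    temp *= 0.999  # cooling schedule

                                                                print(x, f(x))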

                                                          1. 18

                                                            “But the plans were on display …”

                                                            “On display? I eventually had to go down to the cellar to find them.”

                                                            “That’s the display department.”

                                                            “With a torch.”

                                                            “Ah, well the lights had probably gone.”

                                                            “So had the stairs.”

                                                            “But look, you found the notice, didn’t you?”

                                                            “Yes,” said Arthur, “yes I did. It was on display in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying Beware of the Leopard.”

                                                            1. 5

                                                                  Ha, no argument: Apple could be more transparent. But the initial post (and comment thread, for that matter) could have been much more informative without the “they’re coming to steal your datas” speculation. “How to safely use Apple Music” would be a great post.

                                                            1. 1

                                                              Just wow. And yet his column in boot was so cool. Anybody remember that?

                                                              1. 18

                                                                      💾 means save and 刀 means sword. After a while, you don’t need a reason.

                                                                1. 18

                                                                  Give me a minute to put on my tartan kilt, because no true statically typed language would leave it until run time to discover that your code doesn’t handle a null pointer in that one spot.