  1. 26

    In terms of big things, we’ve learned source control, and we’ve learned a bit about what makes it good or bad (no more RCS). We’ve learned that package repositories are good (this won’t get universal agreement, but respectfully, the naysayers are wrong). I think we’ve learned a little bit about what makes a good compiler error message.

    I have a small list of mistakes from programming languages that I think are unlikely to be repeated:

    • a mutable standard library date object (Java)
    • “threadsafe” collections (and StringBuffer) that synchronize on the individual methods (add, get by index, remove) rather than providing higher-level concurrency features (Java); see the sketch after this list
    • dynamic scope
    • widespread implicit conversions (JavaScript). I suspect some implicit conversions might still be considered, but converting arbitrary objects to “[object Object]” isn’t happening again.
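
    To make the second point concrete, here’s a minimal Java sketch (the class and variable names are mine, purely for illustration) of why per-method synchronization doesn’t compose: every individual call on a java.util.Vector is atomic, but a check-then-act sequence spanning two calls is still a race, which is exactly the gap that higher-level constructs (the java.util.concurrent collections, or explicit locking around the whole sequence) are meant to close.

        import java.util.Vector;

        // Vector synchronizes each method, but the isEmpty()/remove() pair below
        // is not atomic as a whole: another thread can empty the vector between
        // the check and the act, so this may throw ArrayIndexOutOfBoundsException
        // depending on scheduling.
        public class CheckThenActRace {
            public static void main(String[] args) throws InterruptedException {
                Vector<Integer> items = new Vector<>();
                for (int i = 0; i < 100_000; i++) {
                    items.add(i);
                }

                Runnable popper = () -> {
                    while (!items.isEmpty()) {          // check (synchronized call)
                        items.remove(items.size() - 1); // act (synchronized, but separately)
                    }
                };

                Thread a = new Thread(popper);
                Thread b = new Thread(popper);
                a.start();
                b.start();
                a.join();
                b.join();
            }
        }

    The same shape of bug applies to StringBuffer: each append call is synchronized, but nothing makes a sequence of appends from one thread come out contiguous.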

    Graydon Hoare has a list of mistakes that Rust avoided, which would be another good source. It’s on his livejournal, but I can’t find it right now.

    In general, I wish more people would spend time accumulating and sharing this kind of wisdom. Too many discussions are closely tied to the ecosystem they happened in, and there’s too much starting from scratch.

    1. 13

      Here’s that post from Graydon you were talking about: “things rust shipped without”

      1. 3

        I think thread safe dynamic scope should come back as a replacement for module level state. :-)

        1. 1

          In terms of big things, we’ve learned source control

          To this, I’d add that we’ve learned in a systemic way that a large substrate of open source code is an extremely powerful form of leverage. A lot of individuals are, I think, starting to learn the ways we might regret giving the industry that leverage, but I don’t think that’s a lesson the industry itself is really capable of learning.

          That dichotomy sums up a lot of what I think about this: The industry is almost constitutionally incapable of broadly learning a great many of the things that individuals within it gradually come to understand, since so many of them are incompatible with the industry’s continued power and success, and with the ways it acculturates new technologists and bends them to its fundamental processes and economies.

          “It is difficult to get a man to understand something, when his salary depends on his not understanding it.” - well, it’s difficult to get an industrial complex to understand that an industrial complex should be destroyed.

        2. 7

          [This is purely philosophical]

          I think we make a mistake when we call products technologies. While marketing is certainly to blame, it’s not as if people in the field should fall for that.

          We tend to call software a technology when it’s really a product. To my knowledge this doesn’t usually happen in other fields, outside of marketing. Products usually use technologies, but they aren’t themselves technologies. A CS professor once used to call the thing that counts as a technology in other fields a “technique”, because it was all so convoluted, but I personally think “product” is a better term.

          When we look at Go, mentioned in the article, I’d say it’s a product. It uses/applies technologies/techniques like CSP, typing, garbage collection, etc. Rust, on the other hand, is one of the first (the first?) languages to implement a new approach to memory safety.

          Another example of this is that famous graph showing how technological advancement speeds up, which includes the invention of the transistor but compares it to the iPhone. I think that’s a mistake: while transistors you can buy are indeed products, inventing transistor technology was a genuine technological advance. The iPhone was certainly an innovative product, but it wasn’t a new technology. It certainly used and implemented modern technologies, just like a lot of new programming languages do, and some of those create new technologies in the process; the iPhone might have done so too.

          Of course one has to keep in mind that technology is one of these incredibly broad terms that can mean different things in different contexts, so I wouldn’t say it’s used wrongly, but because it is so broad and context dependent it’s easy to mix things up.

          Because of this the distinction is hard to track, and I’d argue the same is true in other fields as well.

          I also don’t think that the software industry should necessarily be considered young. While again I wouldn’t say that’s wrong either, one should keep in mind that it kind of split off from electrical engineering and math, and has been around for quite some time. It can be really hard to draw a line for when you start to consider something a program. If you consider a modern computer to be something with state, something you can run some sort of logic on, then you should really consider a lot more than just digital computers.

          What I want to say with that is that technological advancement is very gradual in all fields but may still look very different over time. Take medicine for example. People have been treated for thousands of years, be it by using agents in herbs or by conducting surgery. Both go back to prehistoric times, and while hospitals, pills, the scientific method, etc. changed a lot, one had to lead to the other, and it was a gradual process of learning what works and what doesn’t, and also learning how to learn what works and what doesn’t.

          Since in computer science we work with pure logic, which even allows us to create logical illogic if we want to, and comparatively little is affected by the physical world, there is a lot of freedom. I think this freedom sometimes gets in the way of having the standardized, proven ways you see in other fields, simply because there are many more paths one can take. An example of this is, again, programming languages. Even here there is hardly a way to call one better or more advanced. Newer, more fashionable (e.g. currently that means easy to deal with JSON and HTTP), etc., for sure, but a really big portion of what is done in IT is the taste of the mainstream. JSON (and most alternatives are in a way just variants of it), HTTP, C-style syntax, or even non-LISP or non-APL style languages are more a fashion than anything else. While we build the whole field around these, I don’t think there’s any natural reason that these picked up, and a large amount of that might be pure coincidence. However, with every bit that gets added to what’s common in the field, it becomes harder to go a different route, even if it would be better.

          And here I suspect there are equivalent processes happening in other fields. The more things you consider standard, the less opportunity there is for going down different paths, simply because as soon as there is one standard way, the whole world (tools, industry, even politics and laws) will build around it to support it, and people will learn that these are givens and might not know it was just something that happened to become popular by chance, by marketing, or simply because nobody knew better at the time. However, since all of this is just constructed, one can go down a certain path indefinitely, so in a way it might not matter.

          Given that this is a rather philosophical topic, though, I wonder what would have happened if a different road had been taken here and there, and what current truths will be considered bad practice and ugly legacy in a few decades.

          1. 6

            Rust, on the other hand, is one of the first (the first?) languages to implement a new approach to memory safety.

            Probably the first “production” language to do it, though there were at least a few different research systems that did this kind of thing (e.g. Cyclone, ATS), and the basic type theory underlying the borrow checker goes back to the early 90’s.

          2. 3

            We’ve learned that safety and security do matter, though nobody can yet agree on how much we should care.

            We have arguably surpassed the physical security industry in recognizing that security through obscurity is a bad idea. They still haven’t learned this lesson, it seems.

            1. 2

              We as individuals have learned this; we as an industry? That’s a much more difficult claim to make.

              1. 4

                I see your point, and I think it has some merit, but I don’t mean that safety and security are top priorities or anything; they are at least in the realm of “oh yeah, we should at least pretend to care.”

                All the big OSes are quite responsive to CVEs and generally get patches out in a hurry. All have teams dedicated to stable, safe, and secure computing. All have some form of sandbox, and some of those are even used a little with production software. All have some forms of memory protection active by default. None of this was true 20 years ago.

                The web browser monoculture enforces some bare minimum of security and safety when it comes to web browsing.

                TLS is regularly used across the Internet for transport, and with TLS 1.2 it even has a good chance of being actually secure. None of this was true 20 years ago (TLS was a security disaster, and most places never used it unless the page was asking for a credit card number).

                Many programmers don’t try to invent their own crypto anymore and instead use a high-level library that does all the dirty details (NaCl and friends) without them having to type in or understand the words RSA or AES.

                Many programming languages have ways to track CVEs, including in their packaging ecosystems. Some of them are even used out in the real world! Some even care about supply-chain attacks and have some way to verify both code and binaries.

                Many system administrators at least actively attempt to care. Many organizations are now requiring MFA of some flavour for all users of the network (sometimes kicking and screaming, because they are effectively required to by their cyber-security insurance carriers). Even school districts and small governments are starting to require MFA for their users.

                SOC2 is a thing people ask for now. One could argue SOC2 doesn’t really add security, and they would be right, but it does enhance one’s security posture. A small example: a SOC2-compliant organization at least has to care that users get the permissions they need, and that when people leave their permissions are taken away. I know that’s table stakes for a serious security stance, but it’s definitely progress compared to 10 or 20 years ago.

                Think back 20 years, and security was sort of talked about, but it was whispered in the hallways or at the water cooler. Breaches were hidden and buried faster than you can blink, and the exploited hole was rarely patched (if it was even found or looked for).

                Now security being important is actively talked about. Breaches still get hidden and buried, but the exploited hole is usually at least papered over. Insurance carriers are tired of paying out claims and are enforcing security on the laggards that don’t want to be bothered, or raising their rates so much as to be prohibitively expensive.

                Certainly there is room for improvement, and one could argue a lot of the stuff added in the name of “security” is wasteful/pointless, but at least people in the industry attempt to care a little bit.

                Overall I would say the average computer user is night-and-day different in terms of safe, secure computing compared to the 1990s or 2000s. Note: I didn’t say private; that’s an entirely different topic, one that has arguably gotten worse since the 1980s. :)

            2. 3

              Sadly I think not really, if being able to produce outcomes predictably is an important industry aim.

              As evidence for that, we are no more capable of successfully producing complex software outside the domain of computing and the data sciences than when I started in computing over 40 years ago. When I say successfully, I mean on time, on budget, and to specification. These are what stakeholders really understand, not the “acceptable on release” standard snuck into the small print of modern large-scale undertakings. The failure rate for large, complex commercial, government, and public sector projects remains as high as when I started.

              What we have instead learned is a cycle of: adopt enthusiastically as a silver bullet; discover limitations; discard; forget; rediscover as new under a new name - with a particular focus on languages, platforms, technologies, and the minutiae of computing. These, after all, are the things programmers can control… not success.

              1. 3

                When I say successfully, I mean on time, on budget, and to specification.

                That last point is the key one, I think. Even if software developers have learned a lot, it won’t ultimately move the needle all that much in the face of vague, internally inconsistent specifications. Sometimes project failures are technical failures, there’s no denying it, but in my experience the majority of failed projects are victims of institutional inertia, indecision, and an inability to articulate requirements with the necessary precision and thoroughness. Many projects are doomed at the stakeholders’ level, and programmers are pretty much powerless to do anything about that.

              2. 2

                I like the idea of talking about past mistakes and how they relate to today, but I think we take a more evolutionary approach in general…

                1. 2

                  @hwayne the “Stories with similar links” thing below seems to think your post is a dupe? Dunno why the dupe detection didn’t catch it, or perhaps I’m mistaken?

                  1. 7

                    I’ve pushed a fair amount of code in the last week or two that touched the dupe warning that shows up during story submission, so it’s real likely @hwayne didn’t see it.

                    1. 1

                      No worries, and FWIW I’m glad he reposted it! It’s a great article and I sent it to one of our internal “Developers should read this” lists :)

                    2. 2

                      Huh, that’s odd. Doesn’t even have a different URL.

                      1. 9

                        Sorry for the bug, I’ve deployed the fix. https://github.com/lobsters/lobsters/pull/1116

                        1. 4

                          Thank you for the transparency.

                    3. 1

                      Yes, slowly.

                      1. 1

                        IDK, I’d say we know a lot more than 100 years ago.

                      Stories with similar links:

                      1. Does the software industry learn? via sjamaan 1 year ago | 8 points | no comments