1. 77

  2. 55

    Any tool proponent that flips the problem of tools into a problem about discipline or bad programmers is making a bad argument. Lack of discipline is a non-argument. Tools must always be subordinate to human intentions and capabilities.

    We need to move beyond the culture of genius and disciplined programmers.

    1. 20

      Indeed; this practice strikes me as being uncomfortably close to victim-blaming.

      1. 15

        That’s a good analogy. People like to think they’re good programmers who don’t write buggy code, so when faced with evidence to the contrary in others, they defensively blame the other programmer, because otherwise they’d have to admit the same thing could happen to them.

        I think these broken arguments persist because their internal logic forces us to admit our own faults, which most people find unpleasant, so defensive thinking kicks in to avoid it.

        1. -5

          Victim blaming is not a problem. It takes two people to make a person a victim: a bad actor and an actor without adequate protection.

          1. 4

            This is a pretty gross saying, even if it’s nice-sounding and pithy.

            1. -1

              Not really. Every victim chose at some point to be a victim. That is not to say the other party can be absolved of blame. Far from it, the other party is the guilty one.

              Take software. If nobody chooses hard languages, unsafe languages, nobody will be victimized. Choosing those languages and then blaming the language leaves you responsible for your choices, even while the toolchain is at fault.

              1. 2

                This is absolutely ridiculous. If I walk down the street and I’m mugged, how did I “choose to become a victim”? There are many, many cases where someone becomes a victim at random.

                Your logic applies only if we have some sort of perfect foresight. That’s impossible.

                1. -2

                  When the mugging starts, do you give up? Do you look for an exit? Or do you just hand over your dignity without a further thought? Did you not notice the people before they started mugging you?

                  1. 0

                    To those who downvoted this as a troll: look at Active Self Protection on YouTube for examples of situations where people choose to be, or not to be, victims.

                2. 1

                  Person’s walking down the street. Hell, let’s make them heavily armed, far more “adequately protected” than most people would think is reasonable. A sniper from outside visible range shoots them in the back of the head. They chose to be a victim? Come on.

                  1. -1

                    An exception to prove the rule. Most victimizing isn’t as mismatched, nor as final, as the proposed scenario.

              2. 1

                To the people who downvoted this as a troll:

                Come on. This position is held in good faith, and I only raise it because yes, saying a person is at fault for choosing their tools is indeed victim blaming. And victim blaming is not a problem.

            2. 20

              We’re at the point we already were at in the ’60s with cars and in the ’90s with planes. Everything was “driver error” or “pilot error”. In those cases, since the results were actual fatalities, at a certain point a group of people basically said: “that’s always going to happen, we’re always going to have human error, get over yourselves; how do we prevent people dying regardless?”

              And that’s how we got seat belts, airbags, telescopic steering wheels, etc. for cars and detailed checklists, redundant systems, etc. for airplanes. I think 2017 was the first year with 0 fatalities for air travel. So it can be done.

              It’s a very difficult mindset issue.

              1. 11

                Tool authors should not blame the programmer, but programmers should not blame their tools. Discipline and responsibility are needed from both.

                1. 8

                  If you’re writing a lot of code in a memory-unsafe language, and find yourself occasionally writing memory- or concurrency-related bugs, and you know there are languages out there which make such bugs impossible or hard to write with no loss in performance or productivity, is it not okay to blame the tool? When should a carpenter stop blaming themselves for using the hammer incorrectly, and just accept that they need a new hammer whose head doesn’t occasionally fly off the handle?

                  1. 3

                    Discipline and responsibility means using the most appropriate tools for the job, being aware of the limitations of those tools, and doing what is necessary to compensate for those limitations.

                    1. 3

                      I agree with that. I work as a C++ programmer specifically because it’s the right tool for the particular job I’m doing. However, we use a safer language (Go in our case, because that’s what we know) for stuff where C++ isn’t strictly necessary. If performance and avoiding GC were bigger concerns, or if we were on weaker hardware, Rust instead of Go would have been very interesting for all the parts where we don’t have to interact with a huge C++ library (in our case WebRTC).

                    2. 2

                      When a carpenter loses a hand, they don’t blame the bandsaw. That I see so many programmers blame dangerous tools for being dangerous means we haven’t even reached the level of a craft yet, let alone an engineering discipline.

                      1. 18

                        I appreciate the analogy, but it doesn’t really apply. First of all, there is no tool that can replace a bandsaw for what a bandsaw does well. However, any worker who uses a bandsaw recognizes that it’s a fundamentally dangerous machine, and they take safety precautions when using it. If the analogy really applied, it would be considered unthinkable to write C code without a full suite of static analyzers, valgrind test suites, rigorous (MISRA C-level) coding standards, etc. Second, and more importantly, the saying applies to the quality of a tool, but sometimes the tool is simply too dangerous to use: good carpenters will refuse to use a tablesaw without an anti-kickback guard or other safety features. Finally, when there was another tool that would do the job just as well, they’d use it.

                        1. 9

                          https://schneems.com/2016/08/16/sharp-tools.html

                          This is not how tools work, either in programming or in carpentry.

                          1. 3

                            This is a fantastic essay and you should submit it as a story.

                          2. 8

                            When a commercial carpentry shop has a carpenter lose a hand to a bandsaw, they are more or less forced to stop and try to make the bandsaw safer. This might be done by adding safety features to the tool, or by redesigning the process to use it less/differently, or by ensuring that the workers are trained in how to use it correctly without risk and are wearing proper safety equipment.

                            It’s not the carpenter’s fault, and it’s not the bandsaw’s fault, it’s the fault of the system that brings them together in a risky manner.

                            1. 4

                              The company should not blame the employee or the bandsaw, and the employee should not blame the company’s training, procedures, or the bandsaw. Discipline and responsibility are needed from both: the company’s, to make the bandsaw, training, and procedures as safe as possible and to certify only employees capable of operating the device safely; and the employee’s, to follow the training and procedures properly and not defeat the safety measures.

                              Assigning blame is useless. The focus should be on identifying all root and proximate causes and eliminating each one, with priority chosen based on the hierarchy of values.

                    3. 36

                      You cannot fix things with discipline.

                      1. 13

                        Rust is superb for these cases, where proper memory management is important. It even prevents you from creating accidental data races, unlike managed languages such as Java or C#.

                        I agree that you can’t blame developers for such bugs. Even the best of the best developers make mistakes; larger applications can become enormously complex where memory is concerned.

                        I do blame developers for complaining about memory bugs in C(++) programs though, when choosing a language guaranteeing memory safety like Rust was/is a viable option.

                        1. 1

                          Was/is it, though? How often do programmers actually get to choose the programming languages or tools for a project? From my (limited, sure) experience, not very often.

                          Also, technical merit is not the only parameter in determining the best tool for a job. New languages have an adoption cost and a maintenance cost (i.e., how hard is it going to be to hire more people to work with them?). This has to be factored in with deadlines, and since time is money, the final decision is pretty much never made by the programmers.

                          So, no, I don’t think it’s valid to blame programmers for “choosing” C/C++ when that decision is so often not made by them. At least not without some extra qualifiers.

                        2. 12

                          This is a bit of an aside from what’s discussed in this article, but there’s something I’d like to know about the source this article cites: how many of the UB instances are integer under/overflow problems? Are integer bugs counted as memory bugs, or were they accounted for separately?

                          The reason I’m curious is that integer safety in C seems to me a lot harder than avoiding going off the ends of arrays. Adding two big (or two very negative) signed numbers together is UB. Negating INT_MIN is UB. Arithmetic expressions in which all the explicitly declared variables are harmless-looking unsigned integers can have subexpressions silently promoted to int, which is signed, overflow, and invoke UB.

                          I think it’d be reasonable for the study to have included most or even all integer bugs in the “memory bugs” category because UB due to integer overflow can immediately result in memory oopsies, like “this number which overflowed is being used to compute the size for a call to malloc(), resulting in us trying to access a large index on a small array”. From the C abstract machine’s point of view, it doesn’t even make sense to distinguish them: UB is UB.
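                          As a minimal C sketch of that malloc() scenario (the helper name `mul_overflows` is mine, not a standard function): unsigned wraparound is well-defined rather than UB, but an unchecked size computation still wraps to a tiny value, so the multiplication has to be guarded before the allocation.

                          ```c
                          #include <stdint.h>
                          #include <stdio.h>
                          #include <stdlib.h>

                          /* Illustrative helper (not from any standard library): returns
                             nonzero if a * b would wrap around SIZE_MAX. */
                          static int mul_overflows(size_t a, size_t b) {
                              return b != 0 && a > SIZE_MAX / b;
                          }

                          int main(void) {
                              size_t count = SIZE_MAX / 2 + 2; /* e.g. an attacker-supplied count */
                              size_t elem = 2;

                              if (mul_overflows(count, elem)) {
                                  puts("overflow detected"); /* refuse the request */
                              } else {
                                  /* Not reached here, but without the guard, count * elem would
                                     silently wrap to a tiny value and malloc() would succeed
                                     with far too little memory for the intended array. */
                                  char *p = malloc(count * elem);
                                  free(p);
                              }
                              return 0;
                          }
                          ```

                          The same guard-before-compute pattern applies to additions used in size arithmetic; the point is that the check must happen before the operation, because afterwards the wrapped value looks like an ordinary small number.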

                          1. 6

                            I think Rust makes collaboration on multi-threaded code much easier; it has been very hard for me in the past to read multi-threaded code written by other developers. Humans read code sequentially, but multi-threaded code has another dimension. A clean design helps, but real-world code is often not ‘clean’ in this way.

                            1. 5

                              Humans make mistakes, that’s part of being human. If we want to produce artifacts with few mistakes we need a system –that is, a machine made out of humans– to attempt to reliably find these mistakes and make them rare enough to be acceptable for our desired use case.

                              Other engineering disciplines have these systems; see how many double-checks, tests, sign-offs and certifications go into building a bridge. Software developers can do this too, there’s a great write-up somewhere about how NASA develops software. We know how to do it, it’s just that actually getting reliable software is an order of magnitude or two more work (and money) than getting a bunch of coders together to flail away at a problem, and frankly isn’t as fun, so people seldom do it. And there’s the popular mythos of the rock-star hacker that works against the spread of having a culture of resilient systems engineering; Neal Stephenson never wrote any books about rockstar civil engineers.

                              Rust and other automatic systems that look for human mistakes, such as fuzzers and unit tests, are useful for building resilient systems more cheaply and easily. So by making them easier to use and using them more widely, we raise the floor of how crappy the worst code can be.

                              1. 1

                                > If we want to produce artifacts with few mistakes we need a system –that is, a machine made out of humans– to attempt to reliably find these mistakes and make them rare enough to be acceptable for our desired use case.

                                I don’t disagree entirely, but this implies flawless software is impossible, and it isn’t.

                                > Other engineering disciplines have these systems; see how many double-checks, tests, sign-offs and certifications go into building a bridge.

                                Software is not constrained to the physical world as these systems are.

                                > Software developers can do this too, there’s a great write-up somewhere about how NASA develops software. We know how to do it, it’s just that actually getting reliable software is an order of magnitude or two more work (and money) than getting a bunch of coders together to flail away at a problem, and frankly isn’t as fun, so people seldom do it.

                                This is a mistaken angle to look at it. It’s entirely possible to use better languages, which Rust is not, to write software for critical infrastructure. If you think Rust is an example of this, I suggest you take a look at Ada and what it does.

                                > And there’s the popular mythos of the rock-star hacker that works against the spread of having a culture of resilient systems engineering; Neal Stephenson never wrote any books about rockstar civil engineers.

                                It’s my experience the best hackers use good languages and have a good head, whereas the average programmer may have one or the other, and the poor programmer has neither and pretends it’s impossible.

                                > Rust and other automatic systems that look for human mistakes, such as fuzzers and unit tests, are useful for building resilient systems more cheaply and easily. So by making them easier to use and using them more widely, we raise the floor of how crappy the worst code can be.

                                See here for my thoughts on fuzzing and how it is naught but an extra system for exhaustive testing, just because it’s available. If you rely on a fuzzer to find bugs, you don’t know what your program actually does.

                                In closing and to repeat, look at Ada for an example of a language actually designed to prevent common bugs. Rust’s symbol vomit syntax is another point against it, while I think of it.