1. 24
  1. 12

    We write insecure software because our coding environment makes it easier to write insecure software than secure software.

    I don’t think this theory is correct. If you look at things that are “secure” elsewhere in the world, it’s things that are made over and over the same each time and their design has been figured out to be secure. Software is always a first time: by its nature, we’re solving a problem for the first time almost all the time. The grand design might be a repeat, but there are always specific details that make it unique. Maybe it’s getting something to work on a new OS, or maybe it’s getting it to scale to a new level. Whatever it is, it’s always a bit different.

    1. 2

      I think this is close, but maybe not completely correct. I could write the same program 100 different ways, passing the same set of integration tests, and still have lots of security flaws in it. Something as simple as leaving text unquoted because I was unaware of the risk, or not handling a NULL pointer returned from a failing malloc. These aren’t design issues, but they may have serious ramifications.
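
      A minimal C sketch of the malloc half of that (the usual fix is an xmalloc-style fail-fast wrapper; the names here are hypothetical):

          #include <stdio.h>
          #include <stdlib.h>

          /* Hypothetical fail-fast wrapper: a NULL return from malloc can
           * never silently flow into the rest of the program. */
          static void *xmalloc(size_t n)
          {
              void *p = malloc(n);
              if (p == NULL) {
                  fprintf(stderr, "out of memory (%zu bytes)\n", n);
                  exit(EXIT_FAILURE);
              }
              return p;
          }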

      But adopting the design of previously “secure” software may lead to new software that is less horrible when it is wrong. If the code with these problems runs in a chroot jail with dropped privileges, a segfault just isn’t so interesting.

      1. 1

        If you look at things that are “secure” elsewhere in the world, it’s things that are made over and over the same each time and their design has been figured out to be secure.

        Do you have an example of something that’s secure elsewhere in the world?

        1. 5

          Building security is mostly figured out, and the insecurities are fairly well known. Presidential security is understood. Etc. That isn’t to say these things are infinitely secure, but that they are understood quite well.

          1. 3

            Figured out? Perhaps. Actively being used? Not so much.

      2. 8

        To see what it would be like, I replaced all the int-typed return codes and parameters in a C project with strongly typed structs, ensuring the wrong value never flowed from here to there. So much pain.
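
        A minimal sketch of the pattern, with hypothetical names (the real project was larger, of course):

            /* Single-field structs turn "just an int" into distinct types,
             * so the compiler rejects a byte count where a user id belongs. */
            struct user_id    { int value; };
            struct byte_count { int value; };

            static struct byte_count quota_for(struct user_id uid)
            {
                /* ... real lookup elided; any computation must unwrap .value ... */
                return (struct byte_count){ .value = uid.value * 1024 };
            }

        Calling quota_for() with a struct byte_count is now a compile error, which is the win; the pain is that every arithmetic operation and every call site has to wrap and unwrap .value by hand.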

        1. 5

          Did it reveal potential errors in the code?

          1. 4

            No, not at all. Actually, it was part of a refactoring effort that introduced several errors (though unrelated to this particular change).

          2. 4

            Yeah, I would expect this to be the case. Programming languages often lack good UX design.

            Consider Go, where val, err := failingFunc(), followed by an if, is practically required everywhere. There is a shorter way, if val, err := failingFunc(); err != nil, but then val is scoped to the if; to use it afterwards you have to declare val and err beforehand and drop the : from the :=, thereby adding back the saved tokens…

            The UX around this sucks, despite the pattern being standard. I can’t imagine the excess noise required to do this in C…
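
            For what it’s worth, the classic C shape of the same pattern looks roughly like this (failing_func and friends are hypothetical), with a check-and-bail stanza after every call:

                /* Hypothetical convention: return 0 on success, nonzero on
                 * error, and pass results back through out-parameters. */
                int failing_func(int *out);
                int another_failing_func(int in);

                int do_work(void)
                {
                    int val;
                    int err = failing_func(&val);
                    if (err != 0)
                        return err;   /* propagate by hand, at every call site */

                    err = another_failing_func(val);
                    if (err != 0)
                        return err;

                    return 0;
                }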

          3. 3

            What’s an example of strongly-typed string concatenation?

            1. 3

              Seems like you could replace the words “insecure software” with “bad code” and “secure software” with “good code” and it’d read like so many essays I’ve seen before. There’s a lot of bad code out there because it’s possible to write bad code that does “good things”, i.e., software that works may be riddled with “bad code”. The reverse is true as well: “good code” might not work (i.e., not do what was desired).

              I won’t go so far as to say that this is unique to our domain, but sometimes it certainly seems like it, i.e., how much “bad code” we can get away with writing because it provides so much value anyway.

              1. 2

                This is a good article, but it seems to be focused more on general code quality than security per se. I would hope that most professional programmers understand the perils of naive string interpolation and CSV (if you can, just use a library, dude). Integer overflow is more of a “gotcha” because slapping a Maybe on every mathematical function is a non-starter. (I’m a fan of static typing but… just, no.) I generally prefer the Haskell solution, in which we have a sharp-cornered Int, an unboxed Int#, and a reliably correct but slightly slower (and, unless you’re doing numerical work or systems programming, plenty fast enough) Integer. It might be nicer to be able to set the default overflow behavior to throw an exception, and have a faster wizards-only Unsafe.Math module, but I’m digressing here…
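
                To make the exception-on-overflow default concrete (sketched in C rather than Haskell, since that’s where the cost shows up), it amounts to something like this, using the GCC/Clang builtin __builtin_add_overflow; GCC’s -ftrapv flag is the nearest set-it-globally equivalent I know of:

                    #include <stdio.h>
                    #include <stdlib.h>

                    /* Checked addition: abort on overflow instead of silently
                     * wrapping. __builtin_add_overflow is a GCC/Clang
                     * extension, not standard C. */
                    static int checked_add(int a, int b)
                    {
                        int sum;
                        if (__builtin_add_overflow(a, b, &sum)) {
                            fprintf(stderr, "integer overflow: %d + %d\n", a, b);
                            abort();
                        }
                        return sum;
                    }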

                Unfortunately, and it’s depressing that I feel a need to say this, I don’t buy into the title’s implicit correlation of modernity (e.g. “It’s 2015”) with software quality. What I’ve seen is that development practices are getting shoddier and more “Agile” (i.e. micromanaged and metrics-focused) over time, and that code quality is plummeting. This was supposed to be an R&D job for highly-compensated and trusted people who’d be given the time to do things right (and be expected to do so). Now, we have commodity founders building commodity startups staffed by commodity engineers… and it’s awful. We, as an industry, are “still” writing insecure software because we’re still writing shitty software, and as long as the arrangement persists in which the majority of paid programmers are plebeian business subordinates, that won’t change no matter how much our tools evolve. If you gave them Haskell, Agile/Scrum commodity programmers would just use unsafePerformIO everywhere… because it’d be too much work to actually understand the language, and who can afford the time when sprint deadlines are coming up? (On a side note, Agile and Scrum must be massively annoying for whoever has the phone number (214) 748-3647.) I mean, I really wish that I could say that I saw development practices and tools becoming better over time… but the facts on the ground are that, until we collectively decide that we’re no longer going to allow ourselves to be treated as business subordinates rather than trusted professionals who’ll do our jobs right, the trend is in the other direction.

                Why do we write shitty, unreliable software? Well, there are the theoretical “software is inherently hard” answers, and even the design trade-offs. Any language that defines Int64 addition to have the type signature of Int -> Int -> Either ArithException Int is going to turn people off – even me. I shouldn’t need to break out monadic do-notation to write a simple math function. Those factors explain why diligent, very good programmers (like most of us, I presume) write imperfect software. Then there is the separate question of why industry-average programmers write terrible, buggy, insecure, unreliable software. The answers to the first question are intellectually interesting and sometimes touch on deep mathematics – reasoning of any kind about arbitrary code is impossible, hence “don’t write, or accept, arbitrary code” – while the answers to the second are parochial and sociological.
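
                The ergonomic objection is easy to see once you sketch that signature out; in C it would amount to something like the following (hypothetical), and even a + b + c becomes two calls and two branches:

                    #include <stdbool.h>
                    #include <limits.h>

                    /* The Either ArithException Int shape, as a C return type. */
                    struct add_result {
                        bool overflowed;   /* the Left / ArithException case */
                        int  value;        /* the Right / Int case */
                    };

                    static struct add_result add_checked(int a, int b)
                    {
                        if ((b > 0 && a > INT_MAX - b) || (b < 0 && a < INT_MIN - b))
                            return (struct add_result){ .overflowed = true };
                        return (struct add_result){ .overflowed = false, .value = a + b };
                    }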

                1. 2

                  What year it is, to me, is entirely irrelevant. It’s 2015; so what?

                  As I heard someone say in Rectify (TV series): “don’t let all of this technology lull you into believing we live in modern times.”

                  1. 2

                    Interesting how the author dismisses Haskell when it’s much further along the path being argued for – look at the giant “unsafe” you have to put there to do anything, well, unsafe. And yet Haskell is there, but even in 2015 people aren’t using it. I don’t think it’s a question of ignorance; we write insecure software because the economic incentives encourage us to. You can’t get around that by making a language in which it’s harder to do the insecure thing, because languages in which it’s easy to do the insecure thing will always exist.

                    1. 1

                      The good code, bad code argument is fitting here. It’s all about experience and how well you can grasp a language. Face it, people: we humans are not smart enough to grasp complex programs. We need programs that do one thing and do it well. Everything on top of that is an abstraction layer.

                      C is only now starting to get mainstream secure and portable interfaces (arc4random(), reallocarray(), pledge(),…). You can’t solve these problems with a strict type system, as this limits the flexibility of the language.
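
                      reallocarray() is a good example: it folds the multiplication-overflow check into the allocator itself (assuming a libc that ships it, e.g. OpenBSD, or glibc 2.26 and later):

                          #define _GNU_SOURCE   /* glibc needs this to expose reallocarray() */
                          #include <stdlib.h>

                          /* Old way: nmemb * sizeof(int) can overflow and silently
                           * under-allocate. reallocarray() detects the overflow and
                           * fails with ENOMEM instead. */
                          int *grow(int *a, size_t nmemb)
                          {
                              return reallocarray(a, nmemb, sizeof(int));
                          }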

                      I know C is not for everything, but always consider how much you can do with C given how simple it really is.