1.

    This is a good complaint, but I think we need some suggestions on how we can convince programmers to be less stupid, because telling them to be less stupid just isn’t working.

    1.

      Use tools with fewer features and more interoperability. Fork complicated projects, rip out the useless features, and call the result “Lite/Secure”.

      1.

        Alas, interoperability is often itself a problem.

        1.

          In what way? I thought “do one thing well” was all about delegating tasks out to programs that have devoted the time and energy to being secure. Could you please elaborate?

          1.

            The “delegation” is itself a potential vulnerability. That’s not to say it has to be, but once tools can interoperate, people expect them to interoperate, and that causes trouble. Consider the venerable less program, which can execute dozens of not entirely secure helpers. They are each individually vulnerable, but less, through the powers of delegation and interoperability, has found a way to become the sum of their vulnerabilities.

            http://marc.info/?l=full-disclosure&m=141678420425808&w=2
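
            To make the delegation concrete, here is a minimal sketch of the pattern (the helper table is hypothetical, not less’s actual implementation): an extension-based dispatcher in the spirit of LESSOPEN/lesspipe, where every helper the pager can invoke joins its attack surface.

            ```python
            import subprocess
            from pathlib import Path

            # Hypothetical helper table, loosely modeled on lesspipe-style
            # preprocessors. Each entry is an external program the pager may run.
            HELPERS = {
                ".gz":  ["gzip", "-dc"],   # decompress to stdout
                ".tar": ["tar", "-tvf"],   # list archive contents
                ".zip": ["unzip", "-l"],   # list archive contents
            }

            def preprocess(path: str) -> bytes:
                """Delegate rendering of `path` to a format-specific helper.

                The property to notice: the pager now inherits every parsing
                bug in every helper it can reach -- the sum of their
                vulnerabilities.
                """
                cmd = HELPERS.get(Path(path).suffix)
                if cmd is None:
                    return Path(path).read_bytes()  # no helper: show raw bytes
                # Untrusted input flows straight into another program's parser.
                return subprocess.run(cmd + [path], capture_output=True,
                                      check=True).stdout
            ```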

      2.

        Well, the funny thing is that while we can build tools to enforce security within an application (that is, to make sure the application does correctly what it intends to do, without introducing security holes), it is much harder to restrain ourselves as programmers from letting our tools grow options outside their original scope (that is, to make sure our applications do only what they should do, from a philosophical perspective).

        Both of these are security problems, and this is what people try to get at with the term “attack surface.” When you add options, or add semi-out-of-scope convenience features to applications, you also increase the complexity of verifying that the application is free of security flaws, and you increase the likelihood that some interaction between the application’s components, or between the application and other applications, will introduce security flaws.
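
        One rough way to quantify that growth: each independent option roughly doubles the number of configurations a reviewer has to consider. A toy sketch (the flag names here are made up, not taken from any real tool):

        ```python
        from itertools import product

        # Hypothetical feature flags; each one is harmless in isolation.
        OPTIONS = ["follow_symlinks", "run_helpers", "eval_config", "fetch_urls"]

        # Every on/off combination is a distinct configuration whose
        # interactions someone has to audit for security flaws.
        configs = list(product([False, True], repeat=len(OPTIONS)))
        print(len(configs))  # 2**4 = 16; at 20 options it is already 1,048,576
        ```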

        All of this is to say that I don’t know if there is a way to do this that’s any better than teaching people, through hard-learned examples from history, that there are right and wrong ways to do this. If we start to view security flaws the way we view failures in civil engineering, covering them in education with case studies and discussions of what went wrong, and with real care for pre- and post-deployment review of correctness and failures, then we might start to see things improve. In this light, something like Heartbleed becomes the Tacoma Narrows Bridge collapse: something we teach new developers about, not just for the specific lessons of the failure, but for the learned wisdom of how to avoid the same mistakes in a systemic fashion.

        1.

          Thinking through a list of non-goals (e.g. https://github.com/martanne/vis#non-goals) at the beginning of a project sure helps.