1. 35

  2. 2

    I got a good chuckle out of the tarsnap example.

    On a side note, at first I was confused, thought I was going to @tedu’s blog, then realised it was just a different colour scheme. Looks good.

    1. 1

      Hey thanks. Btw, I’d welcome a few more examples of logic errors. Buffer overflows are boring. Something that a random Apple TLS tweet would also apply to.

      1. 3

        When I was writing crypto code at a previous job, I saw my fair share of broken code. The “Regular OpenSSL” example was a common one (i.e. expecting zero to be returned on success, when OpenSSL actually returns 1). Unfortunately, all of that code was closed source and it’s been a while since I saw it. It’s part of the reason I’m big not only on testing, but on testing failures as well. That is, I saw a lot of testing code that only made sure the code worked where expected. Particularly where security code is concerned, though it’s obviously useful elsewhere too, it’s important to make sure code fails where it’s supposed to. That would catch a significant amount of the broken security code I’ve seen. Most of those bugs were not one-line fixes.
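
        As a minimal sketch of that pitfall (lib_verify, record_ok, and MAC_LEN are invented stand-ins, not any real API): the library returns 1 on success, and only the failure-path assertion catches a caller that treats 0 as success.

        #include <assert.h>
        #include <string.h>

        #define MAC_LEN 16

        /* Toy stand-in: like many OpenSSL routines, returns 1 on success
         * and 0 on failure, not the POSIX 0-on-success convention. */
        static int lib_verify(const unsigned char *a, const unsigned char *b)
        {
                return memcmp(a, b, MAC_LEN) == 0;
        }

        /* Caller under test. The broken variant I kept seeing was
         * "return lib_verify(a, b) == 0;", which treats failure as
         * success and sails through any happy-path-only test suite. */
        static int record_ok(const unsigned char *a, const unsigned char *b)
        {
                return lib_verify(a, b) == 1;
        }

        int main(void)
        {
                unsigned char good[MAC_LEN] = {0};
                unsigned char bad[MAC_LEN] = {0};

                bad[0] ^= 1;                    /* tamper with one byte */

                assert(record_ok(good, good));  /* the success path */
                assert(!record_ok(good, bad));  /* the failure path: the only
                                                   assertion that exposes the
                                                   "== 0" bug */
                return 0;
        }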

        Interestingly enough, I never saw failures due to misplaced gotos and only a few instances of missing brace problems; what I did see a lot of was a failure to understand secure systems (like using hashes inappropriately). I attribute that to working with talented C devs who just weren’t experienced with writing secure code.

        Security is a field where I think the industry would do well to adopt an apprenticeship-style approach. You don’t want to let novices work with production security code, but you do want to bring up the next generation of secure coders. I was fortunate in that I was basically in that position, and I was able to learn from my mentor and make the mistakes that weren’t pushed to production. The security team also held workshops in-house to make other developers aware of common pitfalls. I’d say that approach worked well for us.

        1. 1

          One other thing, too, that I was just reminded of on Twitter: sanitising sensitive data. A lot of developers, even some skilled ones, failed to consider what would happen when a variable went out of scope. I constantly harped on making sure any sensitive data was wiped once it was no longer needed, often even before error checking in the case of stack-allocated variables. For example, something like

          char temp_key[AES_KEY_SIZE];
          int  retval = -1;

          // ...

          // Wipe the key immediately after use, before the error check,
          // so it is cleared on the failure path too.
          retval = crypt_operator(temp_key, buf, buf_len);
          memset(temp_key, 0, AES_KEY_SIZE);
          if (retval != CRYPTO_SUCCESS)
                   goto cleanup;
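
          (One caveat with this pattern: since temp_key is never read after the memset, a compiler may optimize the wipe away as a dead store; that is the problem interfaces like C11’s memset_s, and later BSD’s explicit_bzero, were designed to solve.)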
          
      2. 2

        Probably appropriate to slap the “satire” tag on this submission.

        1. 2

          “They all date from before 2013. That’s how we know the NSA wasn’t involved.”

          This is quite a silly assertion, and it is 100% untrue. The NSA existed well before 2013, and there is documentation of its activity during that time. At the very least, your closing sentence is disingenuous.

          1. 12

            He was obviously joking.

            1. 2

              I had just woken up, give me a break :)

              1. 2

                True, but he does come off as incredibly arrogant in the process.

              2. 3

                If we posit that the NSA introduced the Apple bug, then logic demands we look back at the past ten years' worth of bugs and assume they are sabotage as well. There are a lot of unburnt witches running free.

                It’s a response to the people demanding Apple identify and fire the person responsible.

                1. 1

                  I would argue that logic demands we look back at the past ten years' worth of bugs from companies that were as directly involved with the NSA as Apple was, not all bugs. What good would Jill OSSdev’s bug data be?

                  1. 3

                    If anybody is being insufficiently paranoid, I think it’s the people who think the NSA restricted their activities to the set of companies whose logos fit on a PowerPoint slide.

                2. 1

                  After reading the tarsnap note, I thought he was making a joke.

                3. 1

                  The description of the tarsnap bug is not quite right: it has nothing to do with the goto; it has to do with the nonce:

                  http://www.daemonology.net/blog/2011-01-18-tarsnap-critical-security-bug.html

                  But I’m not sure if the author was just making a joke there or not.

                  The more serious point though:

                  Why on earth do security developers write code like this?!? Why would a security developer think that doing two things in one statement is a good idea and adds clarity? If the nonce were updated in one statement, init called in the next, and the result compared against the error code in a third, these algorithmic bugs might still slip through, but at the very least they would be incredibly obvious on a read-through. The same goes for {}s in ifs. Just use the extra line! The compiler won’t mind, I promise!

                  I have never written security code so perhaps I’m just being naive, but security code seems so obviously to benefit from clarity.
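
                  To make that concrete, here is a toy sketch of the two styles (all names invented; this is not tarsnap’s actual code):

                  #include <stdint.h>
                  #include <stdio.h>

                  /* Stand-in for a real AES-CTR encryption call. */
                  static void aes_ctr(uint64_t nonce, const char *msg)
                  {
                          printf("encrypt \"%s\" with nonce %llu\n",
                              msg, (unsigned long long)nonce);
                  }

                  int main(void)
                  {
                          uint64_t nonce = 0;

                          /* Two things in one statement: a refactor that
                           * rewrites this call can silently drop the ++. */
                          aes_ctr(nonce++, "first message");

                          /* One effect per statement: a missing increment
                           * now stands out on a read-through. */
                          aes_ctr(nonce, "second message");
                          nonce++;        /* a reused CTR nonce is fatal */

                          return 0;
                  }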

                  1. 1

                    Here’s the best explanation. I couldn’t explain it better myself.

                    http://www.reddit.com/r/programming/comments/1zamk9/a_brief_history_of_one_line_fixes/cfs647y

                    1. 1

                      Yeah, since it’s the internet, I explicitly left open the possibility that it was a joke. That being said, I think the general discussion of valid programming practices in security software is a great thing to come out of this sarcastic post.

                      1. 1

                        If there’s anything I could say to summarize, it would be “it’s not that easy, not even the easy stuff.”

                        “Why on earth do security developers write code like this?!?”

                        Because they like it? Adding the closing } eats an extra line, and if there’s one resource I’m short of, it’s vertical resolution. And in practice, it’s rarely a problem. I would say blindly adding braces strictly reduces the clarity of the code.

                        For every bug, there is some admonishment we can invent, but it quickly devolves to “don’t make mistakes”. If it’s not one thing, it’s another. This does cut both ways, I’m aware. People kept using strcpy because they were convinced they were smart enough to do it right. The difference is that strcpy buffer overflows were legion, and bizarro bonus goto bugs are blue moon events.
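
                        For concreteness, here is that blue moon event (Apple’s goto fail bug, CVE-2014-1266) condensed into a self-contained toy; check() stands in for the real signature checks:

                        #include <stdio.h>

                        static int check(int ok) { return ok ? 0 : -1; }

                        static int verify(void)
                        {
                                int err;

                                if ((err = check(1)) != 0)
                                        goto fail;
                                        goto fail;  /* duplicated line: always
                                                       taken, unconditionally */
                                if ((err = check(0)) != 0)  /* never reached */
                                        goto fail;

                        fail:
                                return err;  /* 0, despite the skipped check */
                        }

                        int main(void)
                        {
                                /* prints 0: "success", verification bypassed */
                                printf("verify() = %d\n", verify());
                                return 0;
                        }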

                        See also: base rate fallacy. How many billions of lines of code use unbraced if? How many security vulnerabilities can you name as a result?

                        1. 1

                          In my experience writing in C-like languages, I have not felt that wrapping one-liners up in {} consumed so much space it was problematic. If anything it has been a pleasant forcing function for refactoring code.

                          I’m arguing that there are actionable steps that could help reduce bugs such as these, rather than just saying “don’t make mistakes”. Whether it’s worth it is a distinct argument. Maybe it’s not. A bug like the Apple one might be rare, but it’s also arguably very expensive (in some sense of the word). I’m also making a more general statement, not just about bracing. For example, the tarsnap bug could possibly have been avoided had the author implemented the algorithm in a more imperative way, where updating the nonce was distinct from using it. But maybe it wouldn’t have been.

                          But I think we can at least learn a lesson and try something different than just shrugging and saying “people make mistakes”.

                          1. 1

                            Sure. I think a decent rule might be “screw up once, shame on you; screw up twice, shame on us.” I will change my stance about braces after the next catastrophic brace failure.

                            A better lesson to learn might be, does your code use ssl? It will be riddled with bugs and require constant security updates for life. But approximately nobody is actually moving away from ssl, even though that’d fix real security problems.

                            The problems we can fix and the problems we should fix are unfortunately rather disjoint sets.