  2. 6

    The infamous part of the Ken Thompson interview just leaves me scratching my head. Certainly there are lots of logic errors where a clever individual can exploit even a memory-safe program into doing something it shouldn’t. But I’m on the FreeBSD security errata mailing list, and there are still tons of posts there that have to do with buffer overflows.

    Statements like this leave me baffled:

    I think that class is actually a minority of the problems. Certainly every time I’ve written one of these non-compare subroutine calls, strcpy and stuff like that, I know that I’m writing a bug. And I somehow take the economic decision of whether the bug is worth the extra arguments. Usually now I routinely write it out. But there’s a semantic problem that if you truncate a string and you use the truncated string are you getting into another problem. The bug is still there—it just hasn’t overflown the buffer.

    But…what if I told you that you could get the safety without an extra argument?? It might cost you a language that does dynamic allocation, but is that worse than a bug you know you’re writing? You can always start there and then walk back the dynamic allocation in the places that matter, but why start with bugs??
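
    To make that concrete, here’s a minimal sketch in Rust (my choice of language, since the thread brings it up below, not Thompson’s): the copy takes no length argument at the call site because the destination is allocated to fit the source.

        fn main() {
            let src = "a string longer than any guess at a buffer size";
            // No length argument, no truncation, no overflow: the
            // destination is sized from the source itself.
            let dst: String = src.to_string();
            println!("{}", dst);
        }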

    This interview reminds me a bit of the Joyent post on Node.js errors:

    https://www.joyent.com/node-js/production/design/errors

    The content of the post is quite a good breakdown of how to categorize and handle errors in Node. A big section of the discussion, though, is how to handle type errors: what if someone gives you null when it’s not allowed, or a number where you expected a string? How should you give them back a TypeError? But if you step back a bit farther, it feels like a document written in the dark ages. We know how to make type errors NOT a programmer’s problem. So we’ve got great advice on how to do something you shouldn’t have to do in the first place, because someone is stuck with tools from the wrong century.
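
    To sketch what “not a programmer’s problem” means, here’s a hypothetical example in a statically typed language (Rust, my pick for illustration, obviously not Joyent’s stack): the bad call never runs at all, so there’s no TypeError to hand back.

        fn shout(name: &str) -> String {
            format!("{}!", name.to_uppercase())
        }

        fn main() {
            println!("{}", shout("ken"));
            // shout(42);   // rejected at compile time: expected &str
            // shout(None); // there is no null to pass in by accident
        }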

    And yeah, there is a whole lot of nuance to the whole thing so take this comment for what it is: a comment on the internet I wrote in 10 minutes.

    1. 2

      But…what if I told you that you could get the safety without an extra argument?? It might cost you a language that does dynamic allocation, but is that worse than a bug you know you’re writing?

      If you just said “a GC you can disable,” I’d think you were quoting a letter from Niklaus Wirth to Ken Thompson. Wirth showed with Modula-2 and the Oberons that we could have an easy-to-compile, memory-safe-by-default language that could handle anything from OS’s to apps on crappy hardware. He also showed, via an unsafe facility (the SYSTEM module), that one could turn off safety to get the benefits of something like C in a specific module. Ada did the same thing, although GC’s for it were either unavailable or optional depending on the time period. Rust does something like this, too. All of those resulted in vastly fewer problems and less debugging than C-based apps.
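
      For the Rust case, the analogue of importing SYSTEM looks roughly like this (a minimal sketch, not production advice): everything is bounds-checked by default, and the unchecked variant is only callable inside an explicit unsafe block.

          fn main() {
              let xs = [10u8, 20, 30];
              // Safe by default: indexing is bounds-checked.
              println!("{}", xs[1]);
              // Opting out is explicit and visible, much like a SYSTEM
              // import: get_unchecked skips the bounds check.
              let second = unsafe { *xs.get_unchecked(1) };
              println!("{}", second);
          }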

      So, Ken Thompson is beyond unconvincing. It’s the same attitude of not giving a shit that left UNIX security, reliability, and usability a mess for years.

      1. 1

        But…what if I told you that you could get the safety without an extra argument??

        I am 100% sure K. Thompson knows how to combine malloc and strncpy to produce that magic result.

        1. 3

          If he thinks a bug is more economical than passing an extra parameter, then I’m quite sure he doesn’t want to call malloc either. But it’s not the ’80s anymore; we have languages that can safely manage memory for us now!
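
          For instance (a minimal Rust sketch of my own, not anything from the interview): the buffer grows as needed and is freed when it goes out of scope, with no malloc, no size argument, and no free.

              fn main() {
                  let mut log = String::new();
                  // The string grows to fit each append; ownership
                  // releases the memory at the end of scope.
                  for word in ["no", "manual", "memory", "management"] {
                      log.push_str(word);
                      log.push(' ');
                  }
                  println!("{}", log.trim_end());
              }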

      2. 1

        Great write-up on these as always.

        “Speculation here, but I’m going to guess analog computers aren’t as helpful for writing digital programs.”

        It’s just his brain that’s mostly analog, from what I can tell. Human brains are always my counter to claims that analog isn’t as general-purpose or useful. We just haven’t done an implementation of general-purpose[-enough] analog yet. There are people trying:

        http://binds.cs.umass.edu/anna_cp.html

        http://www.kip.uni-heidelberg.de/Veroeffentlichungen/download.php/4713/ps/1856.pdf

        “Also, get a better workstation, but don’t expect magic. Ridiculous to think there’s been 30 years of hardware progress since this was written and it’s arguable how much better things have really gotten.”

        With the Worse is Better effect, they put 30 years of hardware progress mainly into speeding up legacy and new software for mainframes, Windows, and UNIXen. That amount of hardware R&D put into machines optimized for Genera’s LISP or Smalltalk would likely have had a vastly different outcome, given those languages and platforms were specifically designed to rapidly produce and iterate on code. I think modern workstations still don’t give you the full capabilities of Genera in your apps. There are workarounds, but it astonishes me that ancient products still have something on the best of today.

        “The sensible advice is to try to maintain separation between what something does and how it does it.”

        That’s the default use of formal methods going way back. It was also a requirement for security certification under TCSEC: an English description of the What that people can understand, a formal description of it to catch problems such as ambiguity, a formal specification of the How for analysis, and source code for the How that visibly maps to the formal spec. Finally, design simply enough for formal verification even if you’re not doing formal verification, since the simplifications often caught spec errors. And that was just the start of what was required. :)
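
        A toy version of that layering, in Rust for concreteness (my sketch, not a TCSEC artifact): the What stated in English and as checkable properties, and the How as code that visibly maps onto them.

            /// What (English): return the largest element of a non-empty slice.
            fn max_of(xs: &[i32]) -> i32 {
                assert!(!xs.is_empty(), "precondition: non-empty input");
                let mut best = xs[0];
                for &x in &xs[1..] {
                    if x > best {
                        best = x;
                    }
                }
                // What (formalized as postconditions): the result is an
                // element of xs, and no element exceeds it.
                debug_assert!(xs.contains(&best));
                debug_assert!(xs.iter().all(|&x| x <= best));
                best
            }

            fn main() {
                println!("{}", max_of(&[3, 1, 4, 1, 5]));
            }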

        “Should you use formal methods? Yes, but make sure you’re benefiting from them. Don’t become a slave to the tool.”

        Exactly. Millions were lost in many attempts to overuse them. Maybe that wasn’t a waste for the early ones, since they’re the reason we know to be careful now. Someone had to try it.