1. 12
    1. 9

      I mean, you still need the intuition to place printfs in the right place, in the same way you need the intuition to place breakpoints or step appropriately. Sometimes having worse tools doesn’t make you smarter, it just makes you less productive.

    2. 4

      It’s also… I’m currently working on a project that uses a mashup of nine languages (eleven, if you wanna split out subset langs like JSON from JavaScript etc.) and a bunch of processes talking to each other over the network. The traditional process/memory debugger wasn’t meant for that. Printf still works.

    3. 4

      So why do a lot of the best software engineers not use a debugger?

      Where does this assertion come from? I have seen no evidence to this in my years, only anecdotes from those quoting the Unix folks of old (or trying to emulate them).

      Further to that, let’s consider Kernighan’s statement, which seems to show up every time this kind of discussion gets going.

      The most effective debugging tool is still careful thought, coupled with judiciously placed print statements.

      One can be even more effective with careful thought and a debugger. It’s too bad that “using a debugger” seems to be synonymous with “thoughtless fixing”, as characterized by the post. I’ve seen beginning programmers use printf statements in a very unthinking way, just as I’ve seen them do it with debuggers. It’s just that with debuggers, they tend to reach a solution faster. In both cases, the solution is usually bad.

      More appeal to authority quoting in the article.

      I sit back and run through the code in my head and think about ‘how could this have happened’.

      Again, you can do this with a debugger. I happen to do it all the time. Thinking and use of a debugger are orthogonal. These continued pontifications about how “real programmers don’t use debuggers” suggest to me that those saying it are only aiming for more programmer machismo points and not dispensing good advice.

      1. 4

        I have seen no evidence to this in my years, only anecdotes from those quoting the Unix folks of old (or trying to emulate them).

        I mean, the fact that Unix doesn’t have a good debugger doesn’t help matters any. (Though Linux is notably worse than most because it doesn’t even have a kernel debugger…) gdb is a hateful experience compared to anything on Windows.

        From the Unix Haters’ book (PDF page 224):

        There are two schools of debugging thought. One is the “debugger as physician” school, which was popularized in early ITS and Lisp systems. In these environments, the debugger is always present in the running program and when the program crashes, the debugger/physician can diagnose the problem and make the program well again. Unix follows the older “debugging as autopsy” model. In Unix, a broken program dies, leaving a core file that is like a dead body in more ways than one. A Unix debugger then comes along and determines the cause of death. Interestingly enough, Unix programs tend to die from curable diseases, accidents, and negligence, just as people do.

    4. 2

      I subscribe to an education philosophy that is different from both of the ones described in the article. The conventional education system is too much a production line designed to churn out industrial revolution factory workers and select a few for more technical roles. It desperately needs an overhaul.

      “Do everything the hard way, it builds character”, however, is not a solution; that is the system we had in the bad old days before the industrial revolution, back when major advances in manufacturing were separated by centuries instead of the weeks or months we have now. The main issue is that it excludes all but the highest-performing individuals. If there is one thing that is evident from the history of science and technology, it is that advances can come from anyone.

      We are on the cusp of a whole new educational paradigm, so let’s not go backwards. My view is that self-direction, combined with maximising access to information and tooling of all kinds, is the way forward. To a certain degree this agrees with Ramanujan’s experience, in that his learning was self-directed. I find it hard to reconcile the idea that it was not this, but his lack of information and tools, that made him a genius.

      As to applying this to software development - ensure that people are learning by developing software they care about and are interested in. Then by all means give them the best IDE and debugger, and also the best documentation and examples, that we can possibly muster. If we are lucky one of them might even come up with a better debugger and make all of our lives easier.

      1. 3

        Right. There must’ve been thousands of people with access to libraries where that particular book was available, but that only gave us a single Ramanujan. That’s not to glorify the rare genius but to say we need to figure out a better way to teach both knowledge and insight.

    5. 2

      I think one important point is that a debugger will only ever help you to find symptoms of a bug, not the bug itself. To find the bug and ultimately fix it, thinking through the code is always necessary. I believe that this is the meaning behind that Kernighan quote in the article.

    6. 1

      Should you use a debugger? I expect this is a horses for courses question, depending on the person and the environment. I certainly find the js debugger in my browser useful, not least because I have little idea what I’m doing.

      When I’m running go code on a kube cluster waiting for the problem to reproduce, copious logging gives me the insight I need, but if there were an easy way to keep a good debugger attached that I didn’t need to select and learn, I might like that better.
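
      The “copious logging” approach can be sketched in Go using only the standard log package. The reconcile step and its field names here are hypothetical; the pattern (every line carries a request id and the relevant state, so interleaved output from many pods can be correlated after the fact) is the point:

      ```go
      package main

      import (
      	"log"
      	"os"
      )

      // reconcile is a hypothetical worker step. Each log line records
      // enough context (request id, current and desired state) to
      // reconstruct a failure from the logs alone, without a debugger.
      func reconcile(logger *log.Logger, reqID string, replicas, desired int) int {
      	logger.Printf("reconcile start req=%s have=%d want=%d", reqID, replicas, desired)
      	if delta := desired - replicas; delta != 0 {
      		logger.Printf("reconcile scaling req=%s delta=%+d", reqID, delta)
      	}
      	logger.Printf("reconcile done req=%s", reqID)
      	return desired
      }

      func main() {
      	logger := log.New(os.Stderr, "", log.LstdFlags|log.Lmicroseconds)
      	reconcile(logger, "req-42", 2, 3)
      }
      ```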

      What I would say is that, overall, our tooling isn’t good enough that you can afford to be unable to do logger debugging.

    7. 1

      The kernel debugger kgdb, hypervisors like QEMU, and JTAG-based hardware interfaces all allow the Linux kernel and its modules to be debugged at runtime using gdb.
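
      As a rough sketch, this is roughly what attaching gdb to a kernel running under QEMU looks like (the kernel image path and command line here are illustrative and will vary per setup):

      ```
      # Terminal 1: boot with QEMU's gdb stub on :1234 (-s) and the CPU
      # halted at startup (-S). nokaslr keeps runtime symbol addresses
      # matching the vmlinux image.
      qemu-system-x86_64 -kernel arch/x86/boot/bzImage \
          -append "nokaslr console=ttyS0" -nographic -s -S

      # Terminal 2: attach gdb using the unstripped vmlinux.
      gdb vmlinux
      (gdb) target remote :1234
      (gdb) hbreak start_kernel
      (gdb) continue
      ```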