This is part of a broader issue: there are very few debugging courses, very few debugging books, and very little study of which debugging techniques work and which don’t. You can broaden this out a bit: debugging is a specific case of science (in the sense of working rationally from empirical phenomena to knowledge about the underlying unknown causes of those phenomena), and science, in general, is poorly studied. You have Popper and Feyerabend and Yudkowsky and so on, but a lot of working scientists don’t take any of them seriously, and there’s really no set of accepted beliefs about how this works; yet this body of theory applies just as well to debugging as to, say, characterizing the metabolic pathways of fructose.
There are some books that have good chapters about debugging: The Practice of Programming (linked in my other comment) and Code Complete, for example. Diomidis Spinellis has written some papers (e.g. Differential debugging) and reviewed Matloff and Salzman’s book on debugging. John Regehr has reviewed four other debugging books, but I haven’t read any of them (I think I started How Programs Fail but didn’t make it far, and Debugging by David Agans looks friendly and welcoming), so I don’t have an opinion. George Pólya’s How to Solve It (about how to do math) is frequently cited in this context, as well as in the context of writing code in the first place.
Maybe part of the reason for this is that, like dance or being a good husband, debugging is principally an activity of transcending our own psychological limitations. A bug in code we wrote ourselves is a delusion, a mistaken belief, and to remove it we must systematically attack our own belief structure, testing it against reality and seeking the flaw; while finding a bug in other code involves a similar process of grokking the broken system, if not in fullness, enough that the author’s mistaken reasoning becomes apparent. Like the higher levels of chess, debugging is not rational; it is the art of finding weaknesses in the rationality of others, or of ourselves.
Faced with these terrible realities, it’s easier for a poor computer science professor (if he is not also a Zen master) to sink back into self-aggrandizing bromides about inherent talent.
That’s a good list. As a result of the post, people have been emailing me about how to teach this stuff, and my method is, ironically, quite ad hoc. I’ll have to forward it on in addition to my scattered thoughts.
I hadn’t thought about it that way, as an act of transcending our own psychological limitations. It clarifies a problem I’ve been thinking about: debugging my life. It used to be much easier for me to make changes that make me happier (e.g., tweaking things so that I procrastinate less). In retrospect, with that framing, it seems obvious to me why things have gotten harder. Of course I’m better at debugging other people’s lives than my own, and of course I find other people’s trivial advice to often be quite effective. At this age, I’ve already found all of the easy problems that I can rationally attack; all that’s left are irrational blind spots. Vice versa, too.
Late last night, I felt like a complete idiot when a conversation made it very obvious what I should be doing with my career/job situation right now. But of course, if the best course of action wasn’t in an irrational black hole in my limited mind, I would have figured it out from the past six months of introspection.
That sounds interesting! What are you going to do with your career?
Early in my career, I was working with a developer considerably more senior, and we were pair programming, debugging an algorithm implementation that didn’t work. I led the work, and when we were finished, the other developer turned to me and said something along the lines of “You know, you’re really good at debugging. That was impressive, I don’t think I could have done that.” Apart from being flattering, since I was just out of school and didn’t really think I had any particular talent, it got me thinking about debugging as a discipline.
This article is right: debugging is really important and strategies for it don’t really get taught. I think a lot of programming works this way, and most of the point of a formal education is giving you the right meta-knowledge: teaching how one goes about learning new things about computers, how you should approach problems, cross-cutting philosophies that won’t get outdated with some new language or framework or discipline. But debugging has enough “implementation-independent meat” in it that some of these things really should be taught.
Ostensibly, the closest class I had in college to something like this was named “Software Engineering,” which I think is a bullshit class everywhere, but at my school it definitely was. It was the class where you get in groups of five lazy or stupid people who aren’t going to learn anything and one person who’s going to drag everyone across the finish line and only learn to hate and mistrust their coworkers, and you get to implement half of some crappy website and a PowerPoint presentation about it. Maybe you pick up some UML along the way. That’s not what this class should be about.
What needs to be taught is not specifically debugging (as in, not with gdb or eclipse, anyway). What needs to be taught is how to re-evaluate your assumptions. Most programming courses are constructive: you start with a problem and a blank sheet of paper, and you build a working solution from scratch. Rarely do you start with something already built, but working improperly, and try to fix what’s already there. It’s strange that this isn’t taught more in schools, because it’s most of what we do in industry. I think the closest you’d get to this in a class would be in crypto courses, because there, even though you may be trying to break something rather than fix it, at least you’re working with an already existing system that you need to evaluate.
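A classic concrete example of this (mine, not something from the article) is the binary-search midpoint bug: code that is already built and almost always works, hiding one wrong assumption.

    /* Returns an index of key in the sorted array a[0..n), or -1 if absent. */
    int binary_search(const int *a, int n, int key) {
        int lo = 0, hi = n;                /* invariant: any match lies in [lo, hi) */
        while (lo < hi) {
            int mid = lo + (hi - lo) / 2;  /* NOT (lo + hi) / 2, which silently
                                              assumes lo + hi can't overflow */
            if (a[mid] < key)      lo = mid + 1;
            else if (a[mid] > key) hi = mid;
            else                   return mid;
        }
        return -1;
    }

Fixing the broken version isn’t a matter of writing new code; it’s a matter of noticing which of your beliefs about the existing code is false.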
Basically, “software engineering” classes should break from the rest of the disciplines that teach you to build new things, and instead be about taking something that already exists and fixing or otherwise modifying it. Experience doing this, and strategies for dealing with extant systems, are really all you need to be better at debugging. Sure, there are tricks you can play with gdb, and there are things you do when writing code to make debugging easier, but the most important thing is this concept, which really doesn’t get addressed by other courses. Probably the best implementation of a course like this would be contributing something to an open source project. Google Summer of Code is a step down the right path here; we just need to find a way to scale it out to be more broadly accessible.
> should be about taking something that already exists and fixing or otherwise modifying it
It’s funny that you say this, because that’s exactly what we did in my Operating Systems class in school. Our assignments had us make modifications to the Linux kernel, such as writing our own version of the process scheduler. I learned a lot in this class. Not just about operating systems, but also about software engineering practice. How to grok and then modify a large existing code base. Not coincidentally, this is also the class that made me get really really good at debugging.
Yeah, my experience was similar. My most useful-for-debugging classes were an OS class where we wrote kernel code, and a UNIX/C class where we focused on fixing very old versions of software to run portably on a large set of modern unices. Both taught by the same really fantastic professor.
The crypto example is there because it’s where I expected most people to have this kind of experience. But you’re totally right.
Many times a recommended practice (e.g. using a debugger) doesn’t appear significantly useful until you’re working on a sufficiently complex project. When you can hold the entire state of the program in your head, print is sufficient. The problem is that when students enter the workplace, the habits they used for school projects no longer scale up to industry-sized codebases.
Also, when students are in firefighting mode, they don’t spend time learning habits that will pay off in the long run. In OS, when classmates had problems with their code, I tried to teach them how to inspect memory locations or set breakpoints with gdb so they could figure out what was going on. However, most of the time I was rebuffed. Maybe it’s because gdb is particularly arcane-looking to Eclipse/Java students, but these students just limped along with printf() statements littered everywhere in their codebase.
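For what it’s worth, the set of gdb commands needed to do what I was showing them is tiny. A minimal session might look something like this (the binary, function, and variable names here are hypothetical, purely for illustration):

    $ gdb ./sched_test
    (gdb) break schedule          # stop every time schedule() is entered
    (gdb) run
    (gdb) print current->pid      # inspect a variable without recompiling
    (gdb) x/8xw run_queue         # examine eight words of raw memory
    (gdb) backtrace               # see the call chain that got us here
    (gdb) continue

That’s roughly the printf() workflow, except you can poke at anything without adding a line of code and rebuilding.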
Debuggers are controversial and may or may not be better than printf() statements, depending on your environment. Consider Tim Bray’s When the going gets tough, the tough use “print” or Kernighan and Pike’s TPOP chapter 5 on debugging. I assume you aren’t going to tell me that you think gdb is too arcane-looking for Brian Kernighan.
What this article is talking about are the mental techniques of debugging; Dan’s example is a combinational-logic circuit in which you can use progressive-refinement debugging in a very simple way. He’s not talking about learning how to use debugging tools.
Exactly: systematic debugging is indispensable when tackling complex projects, whether those are hardware or software. The method I use is to trace the flow of execution (for software) or the propagation of a signal change (in hardware) and narrow down the point at which the actual state diverges from the expected state. This is not really taught in schools, so a lot of students resort to the debugging method of “change something and see if it fixes things”.
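To make that concrete, here’s a toy sketch of the divergence-narrowing idea in C (the three-stage pipeline and the expected values are invented for illustration): record the expected state at each checkpoint, and the first check that fails localizes the bug to a single stage.

    #include <assert.h>
    #include <stdio.h>

    /* A toy pipeline standing in for a much larger system. */
    static int stage1(int x) { return x * 2; }
    static int stage2(int x) { return x + 3; }
    static int stage3(int x) { return x * x; }

    int main(void) {
        int x = 5;            /* one known input... */

        int a = stage1(x);
        assert(a == 10);      /* ...with expected intermediate states, */

        int b = stage2(a);
        assert(b == 13);      /* worked out by hand or from a reference */

        int c = stage3(b);
        assert(c == 169);     /* first failing assert marks the divergence */

        printf("actual state matches expected state at every checkpoint\n");
        return 0;
    }

The same idea works with a logic analyzer on a circuit: compare the observed signal at each stage against what the design says it should be, and bisect.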
To be honest, it surprises me that this isn’t more intuitive to people. I was never explicitly taught how to systematically debug either; it just seemed like the most logical thing to do when my code/circuit wasn’t working.