1. 16

  2. 19

    I do not agree with the last point. Debuggers are useful and use of one does not point to poor code quality.

    1. 5

      I agree. I don’t see how using a debugger is any different from one of the suggested alternatives:

      Use print statements

      If I see code littered with echo or printf or logging statements, it says to me that you’re not sure what the code is doing there and you want some insight.

        1. 2

          If I see code littered with echo or printf or logging statements, it says to me that you’re not sure what the code is doing there and you want some insight.

          I think this only holds if you know all the ways in which the code is driven from the outside. Many of the bugs I find, at least in my code, are the result of emergent behavior I didn’t completely foresee. They are bugs in their own right, sure, but they tend to manifest themselves only when my code runs in the scope of a larger system.

          In such a case I find it immensely useful to have prints, logs or traces all over the place so that I can see how the system is driven into an error state. Granted, having the prints can also mean you’re not sure how your code is being used.
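
          For illustration, a minimal sketch of that kind of tracing in C# (OrderProcessor and its messages are invented for this example):

          using System;

          class OrderProcessor {
              public void Process(string orderId, int quantity) {
                  // Record how the outside world drives this code, so the path
                  // into an error state can be reconstructed after the fact.
                  Console.Error.WriteLine($"Process(orderId={orderId}, quantity={quantity})");
                  if (quantity <= 0) {
                      Console.Error.WriteLine($"rejecting {orderId}: non-positive quantity");
                      throw new ArgumentOutOfRangeException(nameof(quantity));
                  }
                  // ... actual work ...
              }
          }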

        2. 5

          95% of the time I have to use a debugger, it is because the code is not written in a simple and obvious way.

          The other 5% of the time it’s because “some other weird thing is going on.”

          Having to use a debugger to understand what code is doing is absolutely a sign of poor code. Good code can be read on its own and understood with a very high degree of confidence.

          Of course, as a programmer working with other people’s code and legacy systems, knowing how to use a debugger is important. But it still usually means you’re working with code that is overly complex.

          1. 2

            You can also completely understand how a piece of code works but not understand why it exhibits certain behavior. This happens when your code is used in a big opaque system, and it has happened to me a lot. For example:

            class Person {
                public Person(string name, int age) {
                    // Reject obviously invalid arguments up front;
                    // IsNullOrEmpty covers both the null and "" cases.
                    if (string.IsNullOrEmpty(name) || age < 0 || age > 150)
                        throw new Exception("Invalid person.");
                }

                ...
            }

            If the exception is thrown, it is obviously the caller’s fault. But when it shows up, it’s not immediately clear where things went wrong; you would need to read through the whole code to see what’s happening.
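
            To be fair, the stack trace does name the immediate caller – a hedged sketch below, with an invented failing call and assuming the Person class above – but not where the invalid value was originally computed, which is what you usually end up hunting for:

            using System;

            class Program {
                static void Main() {
                    try {
                        new Person("", -1);  // hypothetical bad caller
                    } catch (Exception ex) {
                        // Names the call site that passed the bad data,
                        // but not wherever that data was first produced.
                        Console.Error.WriteLine(ex.StackTrace);
                    }
                }
            }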

            Especially in languages with pointers or references, it is possible that a reference/pointer is copied somewhere in an unrelated function, and aliases with data you use. This way, data can be overwritten when you don’t expect it to be. I usually debug this by stepping through the code and watching the data.
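
            As a contrived sketch of that aliasing (all names invented): a function that looks like it returns fresh data but actually returns an alias, so a later mutation shows up where you don’t expect it.

            using System;
            using System.Collections.Generic;

            class AliasingDemo {
                // Looks harmless, but returns an alias, not a copy.
                static List<int> BuildReport(List<int> data) => data;

                static void Main() {
                    var scores = new List<int> { 1, 2, 3 };
                    var report = BuildReport(scores);
                    scores.Clear();  // also empties report: same underlying list
                    Console.WriteLine(report.Count);  // prints 0, not 3
                }
            }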

            … and yes, it does mean that the code is not perfect. But hey, mistakes do happen.

            1. 1

              The problematic code in this case – from the perspective of my original comment – is the system code.

          2. 2

            Debuggers are symptomatic of not understanding the code. Agree or disagree?

            1. 11

              Oscilloscopes are symptomatic of not understanding the circuit.

              1. 2

                Yes. If you understood what they were doing, you wouldn’t need instrumentation.

              2. 4

                Debugging is symptomatic of not understanding the code.

                I think Hintjens is saying your (understanding of the) code & recent change should be so clear you don’t need a debugger, while c-jm is saying debuggers can be useful even when your code is clear and your changes were small. Both are obviously true.

                1. 2

                  I think they are orthogonal, really: code can be clean and hard to debug at the same time.

                  Elements of “clean” code can even make debugging harder. Splitting logic into more functions that each do a smaller thing makes each piece easier to read, but it also introduces context switches when you trace a path through them.

                  As with everything, there are costs and benefits to every decision; my point was simply that the need to debug is not a good metric of code quality.

                2. 1

                  What’s “the code”, though? Abstractions layered upon abstractions. Finding bugs in legacy code or multi-threaded code is almost impossible without a debugger.

                  1. 2

                    I don’t dispute that.

                    Yet if you are indeed able to understand your code without a debugger, that’s a very good thing.

                3. 1

                  Often you need to debug code somebody else wrote.

                  1. 1

                    “Need” is the key part. Debuggers are super useful, but if I require them to do my job, then something is wrong.

                    1. 1

                      In my mind, debuggers are a great tool to validate and enhance your understanding of how the code actually works. I have sympathy with the thought that it should not be necessary to run your code under a debugger to understand it. What I find weird is that the author puts this in a list together with “If your language offers a REPL that’s cool,” which is for me in the same category of exploratory tool.
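
                      For the flavor of that exploratory style, a quick C# Interactive (csi) session – transcript illustrative, output formatting approximate:

                      > int Clamp(int age) => Math.Min(Math.Max(age, 0), 150);
                      > Clamp(200)
                      150
                      > Clamp(-5)
                      0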

                    2. 3

                      The last sentence of item 7 is dubious. The right language for the right job can make a lot of difference: it shapes the way you think by making you apply certain concepts, which can be great or not-so-great for certain tasks. I think there’s a pretty thick line between tribalism / fanboyism / evangelism and knowing which language is better for which job; the two are not mutually exclusive.

                      1. 1

                        But “the right language for the right job” is a long way from what I suspect he experienced as tribalism, which was likely only ever using one language, no matter the situation. I think this is less prevalent than it used to be, but it’s definitely a thing.

                        1. 1

                          Definitely still is a thing, but it’s a far cry from “the language doesn’t matter”. It’s not as if there’s nothing in between tribalism and acknowledging that the language matters to a certain degree. Only a Sith deals in absolutes.

                        2. 1

                          For every 10 times I hear that the specific language makes a difference, probably 8 or 9 of them are tribalism. I’ve also been scarred by lots of enterprise mandates trying to limit language choice and let it stagnate.

                          Whenever I hear this, I try to check the commit history of the person. If they have a mix of languages used, then I worry less. If they have a sustained history of only using a single language or approach, not even toy projects or minor fixes, then I strongly suspect tribalism.

                        3. 1

                          #7 is weird - maybe it makes you a more productive programmer, but working in a community you value and enjoy makes you a better programmer.

                          Sure, I can hack in C and the program will be more efficient, but instead I can program in Python, something I’m passionate about which aligns with my beliefs, and have a much greater contribution. Dare I say - it makes me a better programmer.

                          1. 0

                            I know, it’s a listicle; but items 1 through 3 will amaze you: I haven’t seen them before in other such lists. (Ere I be accused of clickbaiting, they are: 1. If it works and is still useful, don’t throw it out; 2. Never solve the same problem twice in parallel; and 3. Solve the same problem often in serial.)