  2.

    Agile will certainly not fix the bad code problem. Whatever the original intentions were behind the movement, these days it is deployed by managers to create anxiety and get “product” shipped faster. That’s going to have the opposite effect. It will produce more code but it won’t be better code.

    Writing sound, or even half-decent, software is careful, time-consuming work, and it would get a programmer fired in the business-driven engineering shops that make up most of the tech industry (startups included, perhaps especially).

    It’s actually quite simple: economics. Business code is terrible because no one is willing to pay for good code. It is possible to write good code and there are institutions (most within governments or academia) that are doing it, but it’s not something that the typical software employer is going to pay for.

    The people making the decisions about how fast to expect “delivery” and how much they are willing to pay for code are not the people who get stuck having to maintain it, and most managers plan on being promoted away from messes before technical debt can affect their personal career fortunes. This isn’t hard to do, because usually when things go wrong in a major way, there are multiple components at fault and a good politicker can evade blame or fall back on “I said that it was just a prototype”.

    1.

      Agile will certainly not fix the bad code problem. Whatever the original intentions were behind the movement, these days it is deployed by managers to create anxiety and get “product” shipped faster. That’s going to have the opposite effect. It will produce more code but it won’t be better code.

      This is the opposite of my experience. Agile leads to less code, but more functionality. “Better” is what the business defines it to be; you’ll choose to produce something with a higher defect rate but more functionality if and only if that’s what you value.

      Writing sound, or even half-decent, software is careful, time-consuming work, and it would get a programmer fired in the business-driven engineering shops that make up most of the tech industry (startups included, perhaps especially).

      Overengineering is the worst kind of bad code. This is the real lesson to take from experiences like http://yosefk.com/blog/why-bad-scientific-code-beats-code-following-best-practices.html . It’s much easier to fix code that was written with a casual disregard for edge cases than code that was made to conform to an architecture/pattern/what-have-you that doesn’t make any sense.
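
      To make that concrete, here is a toy contrast (hypothetical code, not taken from the linked post): the first version ignores an edge case but is trivial to patch, while the second buries the same logic under speculative structure, so even a one-line fix means untangling the design first.

      ```python
      # "Casually sloppy" version: misses the empty-list edge case,
      # but the bug is visible and the fix is a one-liner.
      def average(xs):
          return sum(xs) / len(xs)  # ZeroDivisionError on [] -- easy to find, easy to patch


      # "Over-engineered" version: the same arithmetic hidden behind
      # speculative abstraction that exists only to follow a pattern.
      class SumReducer:
          def __call__(self, source):
              total, count = 0, 0
              for item in source:
                  total += item
                  count += 1
              return total, count


      class Aggregator:
          def __init__(self, reducer, postprocess):
              self.reducer = reducer
              self.postprocess = postprocess

          def run(self, source):
              total, count = self.reducer(source)
              return self.postprocess(total, count)


      mean_pipeline = Aggregator(SumReducer(), lambda total, count: total / count)
      # mean_pipeline.run([]) fails in exactly the same way, but now the edge case
      # is spread across three layers instead of sitting in one obvious line.
      ```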

      It’s actually quite simple: economics. Business code is terrible because no one is willing to pay for good code. It is possible to write good code and there are institutions (most within governments or academia) that are doing it, but it’s not something that the typical software employer is going to pay for.

      Bingo. Have you considered the possibility that that is a rational and correct decision on the part of those businesses?

      The people making the decisions about how fast to expect “delivery” and how much they are willing to pay for code are not the people who get stuck having to maintain it, and most managers plan on being promoted away from messes before technical debt can affect their personal career fortunes. This isn’t hard to do, because usually when things go wrong in a major way, there are multiple components at fault and a good politicker can evade blame or fall back on “I said that it was just a prototype”.

      If you think businesses really are mistaken in these decisions, then figure out how to fix it at the business level - how to align managers' incentives in favour of making decisions that will reduce maintenance costs, for example. Making this conversation about the personal virtues of developers and managers is unproductive - ultimately people will find a way to respond to their incentives, like water running downhill, and even find a way to feel like they’re doing the right thing while doing so. http://yosefk.com/blog/people-can-read-their-managers-mind.html

      1.

        Have you considered the possibility that that is a rational and correct decision on the part of those businesses?

        I have. It would be a depressing conclusion, but it may not be the wrong one. Obviously, most businesses cannot afford the investment that would be required for the quality of code that you’d want for a space shuttle.

        I tend to think, however, that I’m right in the long term in regarding the current favor for the cheapest and shittiest as pernicious. Bad code is often the sort of thing that costs a company in the long run but, up front, appears harmless and is certainly cheaper.
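
        To put invented numbers on the “cheaper up front” point (purely illustrative, not drawn from any real project): suppose a shortcut saves two engineer-weeks at ship time but adds a small maintenance drag to everything that later touches that code.

        ```python
        # Back-of-the-envelope sketch with made-up numbers, in engineer-weeks.
        UPFRONT_SAVING = 2.0   # assumed one-time saving from taking the shortcut
        MONTHLY_DRAG = 0.5     # assumed extra maintenance cost per month afterwards

        def net_cost_of_shortcut(months_elapsed: int) -> float:
            """Cumulative cost of the shortcut; negative means it still looks like a win."""
            return MONTHLY_DRAG * months_elapsed - UPFRONT_SAVING

        for months in (0, 4, 12, 24):
            print(months, net_cost_of_shortcut(months))
        # 0 -2.0  (a clear win at ship time)
        # 4  0.0  (break-even)
        # 12 4.0 and 24 10.0  (quietly more expensive than doing it right)
        ```

        Under those assumptions the shortcut stops paying for itself after a few months, which is roughly the window in which the person who chose it has been promoted away.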

        If you think businesses really are mistaken in these decisions, then figure out how to fix it at the business level - how to align managers' incentives in favour of making decisions that will reduce maintenance costs, for example.

        To be honest about it, I’m not sure that I can fix it, or that anyone can. Management is a classic principal-agent problem. We’ve known for a century that executives are a weak link, prone to self-dealing at the expense of businesses and employees both. Shareholders eventually fire executives if the numbers get really bad, but that’s often too late, and the executives are usually smart enough (in a sleazy, self-interested sort of way) to bounce up rather than down.

        In short, these are sociological problems rather than technical problems, and possibly intractable ones.

        My guess, regarding the technology industry, is that we’re going to see a return to a pre-1985 constellation wherein serious programmers are mostly in research, with a very small number in heavy industry working as security or ML specialists, while only the mediocre ones go into corporate programming (including startups).

        Silicon Valley and Wall Street changed that for a while, and not only in the ‘90s bubble. It started in the mid-1980s. You started seeing genuinely smart people go into business programming, because the outlier compensation was immense, and even the standard compensation for strong programmers was higher than in the research world. People would leave their jobs at NASA or in defense or in academia for a 3-to-5 year tour of duty in Silicon Valley, and some would get rich and retire, some would get rich and become investors, and some would return to research (as most of them originally intended to do). Corporate programming was a sabbatical that might pay off big and, if it didn’t, you could always go back to your upper-middle-class job in academia or a government agency.

        Around 2005, we even saw a second wave of idealism (see: Paul Graham): smart people who thought they could change the world of business programming. So, we ended up with the “10x” mythology as well as the culture where 24-year-olds go to conferences and talk about the CAP Theorem as if it were a deep result. (I don’t mean to denigrate that. It’s great that young people are going to conferences and developing those skills. I just wish we had a culture where it was acceptable to go a bit deeper.) The side effect was that corporate programming was no longer used as a sabbatical for people with established credibility in something real; instead, we had a lot of young people piling into it thinking that they’d get rich quickly and be able to retire by 30 (which almost never happens).

        Now, in 2016, it seems that we’re back to the old world. The corporate sector doesn’t need smart people, at least not as programmers. (It still needs quants and actuaries and lawyers, and it needs different kinds of intelligence from marketers and executives.) It has figured out how to make us fungible. That’s not something to be angry about; that’s its job. But we’re idiots if we stick around competing with H-1Bs and boot-camp grads, working on braindead business problems, while wages decline over the next 10 years to regular-office-worker levels. So, I think that the grand conclusion of the 2010s is that the last three decades were an anomaly, and that business programming (which is what most of us here do) is going back to being for people of lesser talents.

        1.

          Ultimately, economics abhors a niche with outsized compensation that anyone can enter; we should never expect such niches to stay around long.

          You might be right about the business side of things. Maybe businesses no longer need top talent, and programming can become yet another comfortable middle-class profession for mediocre middle-class people who went to the right schools and have the right credentials. (I think your professional association proposal will accelerate this, which is why I oppose it, but it’s probably inevitable either way).

          But I don’t think you can be right about the research side of things. It may be getting less easy to work on interesting problems and be well rewarded in industry, but we’re a hell of a lot better off than the people on the academic side. There are too many people who want to do research badly enough to accept poor pay and conditions for it, and simply not that many research positions. The upper-middle-class jobs in academia are going, and I don’t think they’ll come back.

          1.

            The upper-middle-class jobs in academia are going, and I don’t think they’ll come back.

            This may be true. Certainly, the past 20 years have not been good on that front.

            At least there, much of the damage is reversible, because the problem is political. We could, in theory, elect leaders in government who will support research and fund it. Unfortunately, the popular sentiment (in the US and in Europe) seems to be going the other way: right-wing jingoism seems to be on the rise, anti-intellectualism is in vogue, and state legislatures (in the US) went hard-right a long time ago. I lived in Madison in 2005-06. A lot of people are surprised that Wisconsin “went red” in 2016; I’m not. In the US and EU, we seem to be locked in a pattern where economic stagnation provokes right-wing politics and stupid nationalism and an aversion to the research efforts and cooperation that economic growth requires… resulting in more stagnation. I don’t know how we get out of it.

            There’s an argument to be made that the meltdown of academia and research would push good people into business programming anyway, even though the outsized rewards are gone. (That could be useful from the business people’s perspective: talent becomes cheaper.) There are two issues there. The first is that “good people” has more to do with skills and experience than innate talent; bitter 150-IQ people who haven’t worked on a real project for 20 years aren’t any more useful than low-talent people. So, if quality experience gets thin on the ground, everyone loses. The second is that, even if it were true, it wouldn’t be useful to society because that kind of talent is wasted in business support roles.

        2.

          It’s actually quite simple: economics. Business code is terrible because no one is willing to pay for good code. It is possible to write good code and there are institutions (most within governments or academia) that are doing it, but it’s not something that the typical software employer is going to pay for.

          Bingo. Have you considered the possibility that that is a rational and correct decision on the part of those businesses?

          It certainly might be, but FWIW, in my experience it is usually not a conscious and rational act. Maybe the outcome is the same, though.