I like the list as a whole but disagree that Technical Debt needs to be retired.
My particular industry is full of code that doesn’t work well, is not easily debugged, and is not easily extended. And it is often requested to tape over issues in bad code instead of doing a rewrite of the egregiously bad parts.
Technical Debt is a great way to describe to product owners that this particular fix might be faster than a rewrite, but over the course of many fixes, using the bad code will be more costly than fixing it. Incurring debt for a quick fix is a trade off. Many times it is the correct trade off, and the term “Technical Debt” describes that trade off well.
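To make the “interest” metaphor concrete, here is a toy sketch in Python. Every number in it is made up; the point is only that the quick fix wins early and loses once enough future changes pay the tax:

```python
# Toy model of the technical-debt trade off described above.
# All figures are hypothetical, for illustration only.

QUICK_FIX_COST = 1   # days: patch on top of the bad code now
REWRITE_COST = 15    # days: one-time cost to fix the bad code properly
TAX_PER_CHANGE = 2   # days: extra "interest" every later change pays
                     # because the bad code is still there

def total_cost_quick_fix(n_changes: int) -> int:
    """Cost of patching now and paying interest on every later change."""
    return QUICK_FIX_COST + TAX_PER_CHANGE * n_changes

def total_cost_rewrite(n_changes: int) -> int:
    """Cost of rewriting now; later changes pay no interest."""
    return REWRITE_COST

def break_even() -> int:
    """Number of future changes after which the rewrite is cheaper."""
    n = 0
    while total_cost_quick_fix(n) <= total_cost_rewrite(n):
        n += 1
    return n

print(break_even())  # with these made-up numbers: 8
```

With these numbers the quick fix is correct if you expect fewer than eight more changes to that code, and wrong otherwise, which is exactly the judgment call the term is supposed to surface.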
Something I only recently came to realize about the term. When used outside of engineering circles, which is increasingly common, folks from the business side of the company hear it differently than engineers do. Engineers often mean it as “quick and dirty stuff we never got time to fix” or “stuff that used to be OK but has dependency rot.”
What the business side hears is “bad decisions tech people made that they want me to pay for”.
So I’ve definitely moved away from using that term.
How do you frame the problem instead?
It seems likely that this isn’t the best-possible metaphor, but I’m also a little leery of framing it like in-group jargon that is accidentally escaping.
Technical debt has always struck me as a bald (and maybe even a little condescending?) attempt to frame this cluster of problems in terms the business side can (hopefully) grok.
Presumably the business people can understand the need to pivot product, fire an employee, or kill a project when they can’t get traction? But they can’t grok that tech decisions are just as prone to fundamental flaws (that usually weren’t obvious when they were made) or to age poorly as conditions change?
In product, talent, and tech, the time always comes to take stock of which way the wind is blowing and do what you can to put it at your back. If they don’t get this, it sounds like they’re expecting magic instead of a blob of tactical and strategic decisions made over time?
In financial business, debt is a useful tool, deliberately taken on for a variety of reasons. In some cases, you can afford to pay it off at any time, but you choose not to because the debt isn’t necessarily a bad thing.
What bugs me about “technical debt” is I’m pretty sure the original meaning was similar to that, yet a lot of times it is just used to describe bad code, even when that bad code isn’t actually buying you anything.
The quote from the article:
“Previously something very specific used in the context of financial technology development. Now means whatever anybody needs it to mean if they want their product owner to let them do some hobbyist programming on their line-of-business software, or else. Can definitely be retired.”
Weird take, as if from the leeriest of PMs though I know the author is not.
I honestly cannot understand how anyone who has worked in industry on legacy codebases could view concerns about improving code maintainability, reducing bugs, and generally making code faster to work with as “hobbyist programming.” In my experience the types of things generally classified as “technical debt” are the high-order bit in improving all the things PMs generally complain about.
The point is that many programmers call something “technical debt” when it’s not really: they’re waving hands and making excuses to do something fun.
I very recently heard “This is written in Objective-C, so it’s technical debt. We should re-write it in Swift.” The code in question had low churn, great test coverage, and few defects. It was probably the best of the entire project.
The author’s position is like saying, “Let’s retire the word ‘sick’. It now means whatever someone wants it to mean when they need a free day off work.”
I mean, everything legitimate can be, and sometimes is, abused. Indeed, the excuse only has force because of the underlying reality.
In my own experience, the phenomenon of PMs, or programmers for that matter, avoiding technical debt when it needs to be addressed is far more common than the phenomenon of programmers abusing the idea to do whatever they want.
Can I offer “serverless”? I’m clearly far too long in the tooth since I’ve never understood how server-side apps could be serverless, but then maybe it’s just me?
Serverless means “you don’t need to manage servers”, not “servers don’t exist”. Given the absurdity of the latter premise, I would think the former premise would be pretty obvious, but clearly I’m mistaken since it’s such a commonly cited point of confusion.
If what it means is different from what it says, it is just a stupid phrase, be it by chance or on purpose.
It’s not different from what it says. There are just multiple interpretations, and some literal-minded people cling to the least plausible.
I see, less is more.
Well, a lot of computing-involved folk lean toward literal-mindedness. I hear the term “serverless”, and my brain flags it as a logical contradiction. Serverless? That does not compute. I feel like that Nomad probe in the original Star Trek episode The Changeling. “Non sequitur, non sequitur, your facts are in error”. And as people go, I’m not usually all that literal minded.
Serverless sounds like a term cooked up by business people to sell yet another cloud thing.
Pure client-side web applications are definitely possible, and could rightfully be called serverless because they actually don’t need a server.
How does the web application get to the browser? :)
firefox /mnt/floppy/myapp.html ;)
lol, my mother does it all the time. Just she uses myapp.htm by habit.
touché
It’s you :)
If you use things like AWS Lambda, Google AppEngine, or even the venerable Heroku, you’re deploying your application “serverless” because while yes, SURE, there are servers churning along behind the curtain, you don’t get to see them, and in fact you don’t get to know they exist.
You upload your code and it starts magically running without any intervention on your part.
It’s about the abstraction presented to the developer, not the back-end implementation.
Yes. I don’t remember where I saw this, but it really should be called “on-demand.”
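To make the abstraction concrete: on a Lambda-style platform, all the developer writes is a plain handler function that the platform invokes on demand. A minimal Python sketch; the event shape and the `name` field here are illustrative, not any real platform’s contract:

```python
import json

# Minimal sketch of a Lambda-style handler. The developer never sees a
# server: the platform calls this function once per request and takes
# the return value as the response. The event shape is made up.

def handler(event, context):
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Locally it is still just a function you can call directly:
print(handler({"name": "serverless"}, None))
```

Everything else (process lifecycle, scaling, the machine it runs on) is the platform’s problem, which is the whole pitch.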
Under the “Object Oriented Programming” heading:
“Luckily the industry doesn’t really use this term any more so we can ignore the changed meaning. The small club of people who still care can use it correctly, everybody else can carry on not using it. Just be aware when diving through the history books that it might mean ‘extreme late binding of all things’ or it might mean ‘modules, but using the word class’ depending on the age of the text.”
I’m not sure if this is a joke or not, but people still use OOP all the time and there’s no consensus whatsoever as to its meaning (although curiously just about everyone who uses one of the meanings seems to insist that everyone agrees with their definition, even though in a given thread multiple definitions will be espoused). Moreover, I’ve very rarely heard people define it in terms of binding time; usually it’s one of “sending messages”, “encapsulation”, “inheritance/classes”, or so on. There are still plenty of outspoken OOP devotees, and the only thing they seem to agree on is an affinity for the name. The lack of clear consensus about the name makes for some incredibly unproductive conversations.
I’m sure it’s a joke, but with some truth to it, in that we’re past peak OOP hype. Nonscientific, but the ol’ ngram search matches my intuition pretty well that it was nigh-inescapable in the 90s, when it nearly always just meant “like C++” (and later “like Java” into the 00s).
(Final edit: granted this may be related to my own immaturity and poor book selection as I was learning programming from the mid 90s to mid 00s.)
“sending messages” is late binding, because the object will do whatever it does when I send the message, and so whatever you get passed at runtime determines what you call instead of some compile time or link time thing.
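A minimal Python sketch of that late binding; the class names are made up. The caller is written against no class in particular, and whichever object arrives at runtime decides what code actually runs:

```python
# Late binding via "message sending": which method body runs is
# decided at runtime by the receiving object, not at compile time.

class Duck:
    def speak(self):
        return "quack"

class Robot:
    def speak(self):
        return "beep"

def greet(thing):
    # greet knows nothing about Duck or Robot. Sending the speak
    # "message" lets whatever object we were handed pick the behavior.
    return thing.speak()

print(greet(Duck()))   # quack
print(greet(Robot()))  # beep
```

The equivalent in a statically dispatched setting would have to name a concrete function at compile or link time, which is exactly the distinction the comment is drawing.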
How about we retire “waterfall”? If you don’t work for NASA, you’re probably not doing waterfall. If you’re not producing six design/spec documents per program, you’re not doing waterfall. Having a design is not waterfall. Having a design is not evil. Waterfall is not evil (it just suits a set of requirements that you probably don’t have). I wholeheartedly endorse “bastardized” workflows over intellectually-pure ones; it probably means that someone has made some concessions to the people who have to actually get the job done.
It’s a fun article. It’s always pleasant to take a shrimp fork to things that annoy us, but, let me use one of my own pet peeves as an example: “DevOps” only REALLY has any meaning as the idea put forth by Gene Kim et al in The Phoenix Project about breaking down silos etc.
The problem is, how does one properly refer to the much needed specialty in our industry involving infrastructure as code, deployments, and large parts of what used to be called release engineering?
Sometimes over- and misused words itch because people abuse them, but until and unless we find better labels for some of the things they point to, we can’t jettison them IMO.
But back to our fun game of snarking at the industry in all its silly grandeur, can we toss “dead” onto the funeral pyre?
I’m so sick of hearing about Java/Perl/Fortran etc. being “dead” I could have a personal protein spill :)
Eh, there definitely are engineers who bridge the gap between product development and product operations. While ideally every developer would think about the operational impact when developing features, not everyone can, and that’s fine as long as some do. Similarly, ideally every operator would think about the impact on developers when making an operational change, but not everyone can, and once again that’s fine as long as someone in operations does. Some people do both development and operations, and I think they bring a lot of value to the team.
I agree with all of the above. But let’s talk about “DevSecOps” and friends next.
DevSecInfraQualityDataScienceOps :)
How did you find my LinkedIn?
Also, their shrimp-forking of “Agile” is one I can’t buy into either.
Sure, large swaths of the industry are throwing this term around without any care at all as to what it means, but despite all that, The Agile Manifesto still exists and has gravitas, and teams are using its principles as well as some of the tropes that have been built upon it to increase their velocity as well as the value they provide their stakeholders.
I’m old enough to remember the industry before it, and despite all the mis-use and abuse of the term, I’d hate like heck to give up sprints, stand-up and the spirit of nimble process that goes along with it.
I think this list is pretty bad and I’m pretty tempted to flag it as spam because low effort complaints about these terms appear so frequently.
“[Artificial Intelligence] now means ‘an algorithm pushed to production by a programmer who doesn’t understand it’”
This is dumb. If we’re not talking about Machine Learning, then you can usually understand how your AI makes decisions. Games are great examples of AI that is easy enough to reason about that you could trace the causes of every decision your AI makes.
If we are talking about ML, “doesn’t understand it” is phrasing I’d like to see retired. We know how these things work. You might not be able to trace the cause of every decision an ML algorithm takes, but they are not magic black boxes. There are plenty of non-DL algorithms that you can write by hand in a little bit of Java/Go/your-favorite-lang that can exhibit pretty complex behavior (kNN is an example). Sometimes I think people forget that “buzzwords” have buzz for a reason.
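For example, a workable kNN classifier really is just a few lines of ordinary code, with nothing opaque about it. A sketch in Python; the training data is made up:

```python
# A hand-rolled k-nearest-neighbours classifier: no magic, no black
# box, just distances and a majority vote. The data below is made up.
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (features, label) pairs; query: feature tuple."""
    neighbours = sorted(train, key=lambda row: math.dist(row[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

train = [
    ((1.0, 1.0), "red"),
    ((1.2, 0.8), "red"),
    ((0.9, 1.1), "red"),
    ((5.0, 5.0), "blue"),
    ((5.2, 4.8), "blue"),
]

print(knn_predict(train, (1.1, 1.0)))  # red
print(knn_predict(train, (5.1, 5.0)))  # blue
```

You can trace exactly why it made each decision (these three points were closest, they voted this way), which is the point: complex-looking behavior from fully inspectable machinery.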
I agree that it’s bad, and I don’t think it should be here, but I’ve been told off for using the spam flag for that. My tactic in lieu of the “low quality” flag reason is to upvote everything else currently on the front page.
Sadly I see too many posts on here that I think would get flagged to hell if “low quality” was an option.
I’d love to kill the term “hack”. Most of the time when people say something is a hack, all they mean is that they dislike it for some personal reason. It is an empty, dismissive insult.
I’d love to keep being able to label awful workarounds and unsystematic solutions with a four-letter word. And as far as I can tell, most of the time people use it for their own work, and for good reason.
I agree. I’d add Deep Learning and Blockchain to the list.
Deep Learning also means too many things these days: quite often just Artificial Neural Networks, or it is used as a synonym for Artificial Intelligence, which is mentioned in the article.
Blockchain often means something different, like Proof of Work or, weirdly enough, sometimes central authority (for example in some cases of private blockchains), but most importantly Hash Chains, Merkle Trees, etc.
There’s also a list of words on harmful.cat-v.org.
I’d argue that sometimes words end up in such a tainted situation on purpose, through propaganda and marketing, both to attach a positive or negative emotion and to hinder productive discussion. See Globalism vs Internationalism, especially the first: depending on where you come from, it means the Internet, easy travel, easy business, working together on research, etc., or it means exploitation of developing countries and their populations, neo-colonialism, exploitation of natural resources for short-term profits, wage dumping, tax and law evasion, etc.
An example of marketing at work seems to be the cluster of terms VM, vServer, compute instance, and (cloud) instance.
The term Crypto seems to partly suffer from a similar issue, though in that case context tends to make things clear.
I think it’s a relatively new phenomenon in IT though, and one that I think is important to counteract, to make sure it remains easy to have meaningful discussions. I am not sure whether retiring is the right approach here, because it might lead to a situation where you can’t read any book or documentation a decade old, and might cause generational divides in companies and communities.
Maybe some normative instance would help here? I’d say Wikipedia, but it’s not normative: it tends to show multiple meanings, sometimes has different articles using the same word with different meanings, and in some cases even links to an article that doesn’t describe the implied meaning.
“Globalism” takes a much darker turn than you have mentioned. If you don’t know what I mean, ask anyone with three sets of parens around their Twitter username.
I thought the people with the triple parens around their username were Jewish? The whole three-parens thing started with antisemites implying that someone or something was Jewish or controlled by Jews, and then Jewish people started using it themselves to announce that they aren’t hiding from the antisemites (or something like that)?
I didn’t want to talk about globalism itself though, just to bring a short example. While both your response and what you wrote about seem to be a perfect example of how emotional the term is, I think it might lead to very off-topic discussions.