1. 22
    1. 43

      How much will this save? For a different discussion a few days ago, I did a quick calculation of how much US power usage would rise if every household had its own 30W server powered on 24/7, and the result was such an absurdly small percentage of total US power usage that I didn’t feel it was an important point after all. Unless my quick calculations were wildly off, I’m not convinced that “inefficient code” is a major contributor to power usage and carbon emissions; the effect may be so small that it’s essentially in the noise.
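
      (For reference, the kind of quick calculation I mean looks roughly like the sketch below; the household count and the consumption totals are rough public estimates, not exact data.)

      ```python
      # Back-of-envelope: one 30 W server per US household, running 24/7.
      # All figures are rough assumptions for illustration, not precise data.
      households = 128_000_000          # approximate number of US households
      server_watts = 30
      hours_per_year = 24 * 365

      per_household_kwh = server_watts * hours_per_year / 1000      # ~263 kWh/year
      total_twh = households * per_household_kwh / 1e9              # ~34 TWh/year

      us_electricity_twh = 4_000        # rough annual US electricity consumption
      us_primary_energy_twh = 29_000    # rough total US energy use, all sources

      print(f"all households combined:  {total_twh:.0f} TWh/year")
      print(f"share of US electricity:  {100 * total_twh / us_electricity_twh:.2f}%")
      print(f"share of US total energy: {100 * total_twh / us_primary_energy_twh:.2f}%")
      ```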

      Of course, there’s a difference between inefficient code and not writing/running code in the first place; your Twitter bot was not “inefficient code”, you just decided not to run the code at all. I feel that quite a large number of servers could be shut down this very evening and very little of value would be lost (I’ll not name which, to avoid controversy and/or an off-topic discussion), but that’s not really the argument you made.

      I also feel your moral reasoning is a little too simplistic; there are many things we do that can increase the suffering of others, ranging from playing music to driving cars to having dogs to what products you buy in the store and/or what food you eat (which goes far beyond not eating meat or animal products, by the way, but that’s a whole different discussion). Living a life that would not increase the suffering of others would be debilitating, and probably also undesirable. The question should IMO not be “does it increase the suffering of others?”, but rather “does it unreasonably increase the suffering of others?” Would a 0.1% difference in global carbon emissions be unreasonable? 0.01%? 0.0001%? There is no clear answer to this of course, but it does highlight the importance of knowing just how much of a difference this will make.

      And the most important point is that I don’t think the “individual change in daily behaviour” model is very effective at effecting such large changes; I can’t think of a single example from history where it has been. While I’m certainly sympathetic to the goals as such, I don’t think it’s an effective strategy for accomplishing much (even if it would save a non-negligible amount of emissions), and I think time and effort are best spent in other ways to solve this problem, such as better energy sources, government restrictions, or throwing those damn “rolling coal” trucks off a cliff.

      1. 14

        The question should IMO not be “does it increase the suffering of others?”, but rather “does it unreasonably increase the suffering of others?”

        It’s important to remember that this is a very particular moral philosophy, one known as utilitarianism, and there are many arguments both for and against it (as there are for most things in the Great Conversation). There are moral philosophies that disagree about viewing morality as an exercise in maximizing utility, such as Virtue Ethics or Divine Right Ethics.

        https://plato.stanford.edu/entries/utilitarianism-history/ is a good resource for this, and https://plato.stanford.edu is a fantastic resource on philosophy in general.

        1. 3

          There are moral philosophies that disagree about viewing morality as an exercise in maximizing utility, such as Virtue Ethics or Divine Right Ethics.

          I can highly recommend reading up on utilitarianism in the context of disability. There’s a case for it (some read it as encouraging shifting resources toward disabled people rather than taking egalitarian approaches) and a drastic case against it (often bound up with scrutinizing the “utility” of an individual). There’s quite a bit of writing on the subject.

      2. 6

        I work on a pretty busy (top 10k) site written in Ruby. The entire production load could be handled comfortably by a single 1U server drawing well under 1 kW. That’s got to be cooled, too, so let’s call it 1.5 kW.

        I mean, that’s not nothing, but rewriting it to draw less power would have a comparable effect to wearing an extra jumper in winter so I could use less heating.
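
        To put rough numbers on that jumper comparison, here is a small sketch; every figure below is an assumption for illustration, not a measurement:

        ```python
        # Illustrative only: annual energy of that 1U server (plus cooling) versus
        # a plausible saving from turning home heating down. All figures assumed.
        server_kw = 1.5                    # draw including cooling, per the estimate above
        hours_per_year = 24 * 365
        server_kwh = server_kw * hours_per_year        # ~13,000 kWh/year

        rewrite_saving_fraction = 0.25     # assume a rewrite cuts the draw by 25%
        household_heating_kwh = 12_000     # assumed annual heating energy for one home
        jumper_saving_fraction = 0.10      # assumed saving from a lower thermostat

        print(f"server total:       {server_kwh:,.0f} kWh/year")
        print(f"rewrite would save: {server_kwh * rewrite_saving_fraction:,.0f} kWh/year")
        print(f"extra jumper saves: {household_heating_kwh * jumper_saving_fraction:,.0f} kWh/year")
        ```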

        1. 2

          Do you mean it’s handled by a single server right now, or that it could be if it were rewritten in a more efficient language?

          1. 2

            It’s handled by a small fleet of EC2 instances, which, in total, add up to about half of what you can pack into 1U.

      3. 3

        If we’re optimising at the level of a Twitter bot, as the poster hints at, we should probably talk about how much more waste is created just by flipping open your laptop, switching on that energy sucker called a “screen”, and then writing a tweet by hand.

        Also, the post ignores that there are indeed services to grade your energy usage, and people thinking long and hard about the problem.

        One person I can highlight is Chris Adams, who gave a good and practical overview of this at Heart of Clojure: https://www.youtube.com/watch?v=bTPzvX9-6VU

      4. 1

        Thanks for reading and giving such a considered response!

        I definitely agree that the point on suffering should probably be something closer to what you state (“does it unreasonably increase the suffering of others?”), and I think you are right that the key is defining what “reasonable” looks like here. It is also tricky because what may plausibly be reasonable for a single actor becomes unreasonable if everyone does it. A good example is littering. It will have almost no effect on the environment if I throw away this plastic bottle. It definitely does have an impact if 7 billion people throw away their plastic bottles.

        With regard to how much you would save, the data does seem to suggest that the electricity usage of cloud computing is not insignificant. I think it follows a similar line of argument to the trash example: for an individual, optimising your code may not make much difference, but if the entire IT industry ruthlessly pursued efficiency (as measured by electricity usage) I believe it could have an impact.

        That said, I believe that without systemic change as you described, it is a little like trying to hold back the tide. However, as things currently stand, that systemic change has not been made, so I believe it is our responsibility to behave responsibly with regard to the environment, or at the very least be aware of our impact.

        1. 5

          It seems to me that time (and thus money) spent on ruthlessly pursuing efficiency is probably better spent elsewhere if you want to make a difference, for example by ensuring those data centres use better energy sources, campaigning for political candidates/parties who take the problem seriously, lobbying policymakers, putting pressure on companies, and so forth. Our time and attention are finite, so we need to prioritize where to spend them.

          However, as things currently stand, that systemic change has not been made, so I believe it is our responsibility to behave responsibly with regard to the environment, or at the very least be aware of our impact.

          Sure, and something like not having a car makes a meaningful impact, which is why I don’t have one. But this? I’m not so sure that it does.

    2. 11

      You forgot to account for what the code achieves and how that would be impacted by writing more efficient code.

      In your framing, even efficient code would be worse than not writing and running it at all. Therefore the question is: does it have a positive impact that could outweigh its environmental costs, and would spending time improving its efficiency, rather than on something else, be the best way to improve the positive outcome of the whole?

      1. 2

        That’s a really interesting point! At what point does the utility of the code outweigh the disutility of the electricity it consumes? I guess you can think of it like so:

        UTILITY OF CODE - SUFFERING FROM ELECTRICITY USAGE = OVERALL UTILITY

        In this case, it would still be beneficial to increase the efficiency of your code, as that would reduce suffering and increase overall utility. There is probably an argument in here as well around how much utility you can produce per unit of work: maybe in one hour you can substantially increase the utility of the code but only marginally increase its efficiency, in which case you are probably better off focussing on the utility of the code.
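
        As a toy sketch of that trade-off (every number below is invented purely for illustration):

        ```python
        # Toy model of the formula above: where is one extra hour of work best spent?
        # Units are arbitrary "utility points"; all figures are made up.
        def overall_utility(code_utility, electricity_suffering):
            return code_utility - electricity_suffering

        baseline = overall_utility(code_utility=100, electricity_suffering=10)

        # Option A: spend the hour on a feature that adds 5 points of utility.
        spend_on_feature = overall_utility(100 + 5, 10)

        # Option B: spend the hour optimising, cutting electricity-related suffering by 10%.
        spend_on_optimising = overall_utility(100, 10 * 0.9)

        print(baseline, spend_on_feature, spend_on_optimising)   # 90, 95, 91.0
        ```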

    3. 20

      I am extremely amused by everybody else’s comments so far. The answer is an obvious “yes”, the author argues their point well, and this is but one facet of a deeper series of questions about the morality of producing code. Right now, in the world, people write code to:

      • Detonate explosives to send packages of explosives long distances through the air in order to kill and maim indiscriminately
      • Mine noisy data and extrapolate imagined racial demographics in order to artificially divide people up by ancestral responsibilities
      • Operationalize humans as fungible interchangeable disposable cogs in a nationalist machine which only values people according to their social credit scores
      • Surreptitiously collect data from unsuspecting users in order to apply psychological manipulations to warp people into docile and emotional consumers
      • Facilitate international money-laundering and organized crime by propagating Satoshi schemes as viable alternatives to already-deployed currencies

      On that final point, it’s widely argued that Bitcoin and other cryptocurrencies are extremely inefficient, and that that inefficiency partially manifests as a measurable draw on the world’s electricity grid and processor supply chains.

      I would suggest that the entire discipline of software development is still in an early stage of maturity, and that it will be a while yet before we are prepared to be true professional engineers. An important step in that development will be when we admit that we are dangerous in our craft and that our actions have pragmatic consequences in the real world.

      1. 21

        The answer is an obvious “yes”, the author argues their point well, and this is but one facet of a deeper series of questions about the morality of producing code.

        The problem is that in fact for many of us the answer is indeed not an obvious “yes”, and many of us don’t think the author argued it well at all.

        1. 0

          At my most charitable, I read you as being stubborn. Are you worried about the consequences of the morality of code authorship? Or do you think that the specific line of reasoning about climate change is not well-supported?

          1. 13

            Is an efficient implementation of code that runs an electric chair moral? Arguably not.

            Is an inefficient implementation of payments code that uses fewer resources than an efficient implementation of cryptocurrencies moral? Arguably so.

            Is the vast collection of inefficient, buggy, terrible code produced during the early career of any programmer while learning immoral? Almost certainly not–that’s how they learn.

            Is efficient code that which uses the least resources, or that which takes the least time? E.g., in the article the author refers to code that runs in minutes instead of hours, IIRC, but I guarantee a slow ETL on a solar-powered Raspberry Pi is more moral (by other parts of the author’s definition) than a very fast implementation on a GPU that needs a coal plant to run.
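
            (That Pi-versus-GPU point is essentially the following arithmetic; the wattages, runtimes, and grid carbon intensities are rough assumptions for illustration only.)

            ```python
            # Emissions ~= average power draw * runtime * carbon intensity of the supply.
            # All figures are rough assumptions, purely for illustration.
            def kg_co2(watts, hours, grams_per_kwh):
                return watts * hours / 1000 * grams_per_kwh / 1000

            slow_pi  = kg_co2(watts=5,   hours=10,  grams_per_kwh=0)    # solar-powered Pi
            fast_gpu = kg_co2(watts=300, hours=0.5, grams_per_kwh=900)  # coal-heavy grid

            print(f"slow ETL on a solar Pi: {slow_pi:.3f} kg CO2")   # 0.000 kg
            print(f"fast ETL on a GPU:      {fast_gpu:.3f} kg CO2")  # ~0.135 kg
            ```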

            On and on like this. Is it more moral to write efficient code that is predicated on new hardware–instead of old code that runs slowly but doesn’t create extra e-waste?

            Indeed, what is moral? Is an efficient, resource-light program written in English acting as immoral cultural imperialism by forcing other folks to use it if they wish to save the planet? Is it immoral to run software that was written by somebody that killed their wife, even if it’s efficient in implementation?

            Is it moral to deprive others of software while making it as efficient as possible? Unclear.

            Is it immoral to force developers to spend extra time on something to squeeze out that last percent of performance when they have other parts of the code to work on? What if the rest of the code is done, but they have kids and a family? What if they are on a trolley and there are 5 Oracle salesmen tied to the tracks but if they can save enough power then we can afford to electrocute an Oracle lawyer too?

            ~

            I don’t really care to debate these questions right now, merely to point out that they exist. When anybody points out anything is “an obvious yes”, and morality is involved, they’re probably assuming too much.

            As for the climate change stuff, all the author had to do was say “I assume as axiomatic that climate change is real and is bad for x, y, z reasons, let’s proceed”. Instead we get throwaway lines like “these are statements which are consistent with the overwhelming scientific consensus, and as such are uncontroversial” which is just…plainly incorrect: there is clearly lots of controversy (not that there should be, mind you, but here we are).

            There’s other stuff in there to nitpick, such as the core of the argument going out the window if we’re using renewable energy or nuclear–as mentioned earlier, a different formulation (“using extra resources is bad, period”) would’ve avoided this.

            Anyways, every time I see philosophy of ethics show up on Lobsters I generally expect to be disappointed.

            1. 0

              It’s really funny that you argue against a moral principle by presenting a series of handcrafted ethical situations designed to provoke exceptions. Your phrasing indicates that you grok the author’s original point and you simply want to poke holes in their word choice. Since you indicate that you won’t actually defend any of these questions, I will invoke Hitch’s razor to ignore them.

              You’re right; the author’s words are terrible. But so are all words. You indicate that you would have been convinced, had only the author used your words to persuade you. Would that we were all so lucky, to have the author reach out and use our words to persuade us! But of course they cannot. That’s not how words work!

              See neighboring comment for a way to generalize the argument from electricity, which may eventually become renewable, to time, which never can be renewed.

              No worries; I’m disappointed too. I was really hoping that you had a meaningful objection.

              1. 7

                I didn’t say the author’s words were terrible, just that I think the argument could’ve been presented differently and been more effective. I also didn’t say the arguing they did was bad, just that it was not well argued (this allows, for example, for argumentation that is merely okay).

                Also: questions don’t need defending–here, they were used to illustrate the point that there’s no “obvious” when talking about moral stances. By contrast–and by your own razor–I don’t believe you’ve actually met the burden of proof for the author’s position being “obviously” correct.

          2. 1

            What about inefficient code that offsets its own carbon emissions via other efficiencies? Is it less moral, but still a net positive? And what if writing the efficient version would have taken more resources to begin with?

            1. 1

              See the author’s second footnote. While e.g. a choice of public cloud providers or airlines can provide the opportunity to be carbon-neutral and to purchase offsets, it does not remove the overall ethical consideration. Instead of electricity, imagine that time is the resource being valued. After all, computation does take place over time; it’s not instantaneous. Additionally, any given human has a limited amount of time for experiences, and societies have a longer but still limited horizon for consolidating themselves into presentable histories. As a result, we might imagine a moral principle that devalues inefficient code as not electricity-wasting, but time-wasting.

              1. 1

                If you consider the resource being wasted as “human time”, then I don’t see how inefficient code is necessarily immoral. For instance, if a tool that I’m using takes an additional 5 ms to display the contents of a list, that time period is literally imperceptible to me - so where’s the moral harm? Moreover, if I run my own code, and it “wastes” some of my time, and my time only, is that immoral, given that I not only wrote the code that way, but chose to consume it?

      2. 7

        The answer is obviously ‘it depends’. The author’s argument is fallacious because, as @markos points out in a parallel comment (and as you yourself support by giving various examples), it matters rather a lot what the code does.

        If you write code that results in energy use reductions equivalent to 1MW, then is it morally wrong to put that in production if it takes 3W to run when an optimal version could do with 2W? Independent of how long it would take to optimize it and what the benefits of other uses of that time would be? Independent of whether you are even aware it can be optimized? Independent of whether taking optimization into account would mean the feature would get axed and you wouldn’t get to write the code at all?

        The answer is only obviously ‘yes’ (or ‘no’) if you consider the question in a narrow context that makes the answer simple, but is unlikely to have anything to do with real world contexts.

        1. 1

          Yes, I understand your point; like you, and unlike the author, I am a pragmatist. I think that ethical decisions ought to be made by weighing the expected outcomes and effects of those decisions in the real world.

          However, the utilitarians are only wrong in their conclusion that utility is good; otherwise, though, their maths can be quite formidable. Here, the argument is relatively simple when stripped down: Computation requires electricity, and electricity is produced largely through unclean and damaging means. Even in the future, computation will still require electricity, and we will have a finite amount of clean renewable electricity which might not suffice for all of the computation which we wish to perform. In either case, we will have to weigh the decision to compute against the consequences of obtaining the electricity.

          Remember that proving that code is optimal is itself an expensive task which we usually cannot afford. Thus, while we can take it as a guiding moral principle to avoid inefficiencies, we cannot simply prescribe always writing optimal code as a solution.

      3. 7

        Detonate explosives to send packages of explosives long distances through the air in order to kill and maim indiscriminately

        Self-defense is not immoral. Making weapons that can be used for self-defense is not immoral. This includes weapons of war where the self has been extended to a tribe or a nation.

        Mine noisy data and extrapolate imagined racial demographics in order to artificially divide people up by ancestral responsibilities

        Please provide a concrete example, I’m not sure what are you referring to here specifically.

        Operationalize humans as fungible interchangeable disposable cogs in a nationalist machine which only values people according to their social credit scores

        This is a horrible mischaracterisation of the social credit score system. By the way, the US has a similar system: it’s called the credit score. Besides, the Chinese are allowed to govern themselves how they want. Please stop projecting your cultural values onto other people. That’s just cultural imperialism.

        Surreptitiously collect data from unsuspecting users in order to apply psychological manipulations to warp people into docile and emotional consumers

        If people are free to choose, then you are free to try to convince them. Just because a convincing technique is effective doesn’t make it immoral.

        Facilitate international money-laundering and organized crime by propagating Satoshi schemes as viable alternatives to already-deployed currencies

        Useful tools are also useful to immoral actors. That does not make the tool itself immoral.

        1. 2

          I said “explosives,” not “weapons.” Do you think that there should be safety guidelines which help prevent ANFO explosion accidents, like the recent ANFO explosion in Beirut? Then you have a moral stance on explosives safety.

          Concretely, the mathematics of pedigree collapse implies that humans only form one race. Thus, all racial profiling is illusory and artificial. I have taken apart racist technocratic positions before here, if you need examples of why it’s important to understand the imaginary nature of race. The phrase “mine noisy data” refers to the noise miners, a facet of modern pseudoscience where people imagine spurious correlations and artificially boost resonant signals in order to hallucinate causes and natural laws.

          I did not say “Chinese;” which of us is “projecting,” I wonder? On the other hand, “social credit score” is the only way to properly evoke what’s happening; “reputation” would not have worked, as the specific phrase “reputation system” is not yet popular in discourse. (This is despite my poor attempts.) You’re correct that this is happening worldwide and in many different forms, but you’re incorrect that the vast majority of the Chinese people are currently experiencing democratic governance (i.e. “govern themselves how they want”), and so I’m not sure if there’s a coherent path forward for this line of argument.

          Speaking of reputation systems, Facebook is a good example of a reputation system which collects user data even when users do not consent. For example, Facebook tracks people who are not logged in. This should make you question whether people are really “free to choose” whether they participate. Meanwhile, Facebook is complicit in too many anti-democratic actions to list quickly; a special highlight has to go to their role in the Rohingya genocide.

          Money laundering is usually criminalized not because it is morally bad to handle money, but because it is morally bad to abet criminal behavior. And while I could understand the argument that not all crimes are immoral, the fact is that the sort of crimes which need lots of money to commit are usually quite harmful! Cryptocurrencies facilitate money laundering, on the scale of billions of USD/year.

          1. 3

            I said “explosives,” not “weapons.”

            In the prior description

            Detonate explosives to send packages of explosives long distances through the air in order to kill and maim indiscriminately

            That is clearly a description of some form of weapon system. Specifically, you were referring to people writing code. Nobody wrote code to detonate the ANFO explosion in Beirut, so I don’t know why you are bringing that up rather than addressing your actual argument and its counterpoint.

            Concretely, the mathematics of pedigree collapse implies that humans only form one race.

            I don’t see how that follows at all.

            I did not say “Chinese;” which of us is “projecting,” I wonder?

            You talked about social credit scores. That’s a Chinese system.

            Are reputation systems immoral?

            but because it is morally bad to abet criminal behavior. And while I could understand the argument that not all crimes are immoral,

            If some crimes are not immoral, why would it be immoral to abet those crimes?

        2. 1

          By the way, the US has a similar system, it’s called the credit score.

          The US credit score is completely and totally different than the CCP’s social credit system - they’re not similar at all.

          The US government cannot change your credit score at all, while the CCP can (and regularly does) change citizens’ social credit scores to silence dissidents who have committed nothing that a reasonable observer would judge as “immoral” - such as speaking out against the government.

          Your US credit score is only involved with credit-related transactions (e.g. buying a house, which is a reasonable thing for it to be involved with), while the CCP’s social credit score can prevent you from traveling, which has absolutely nothing to do with credit, or getting accepted into universities.

          Your US credit score is based on your behavior alone, while the CCP’s social credit score depends on what your friends and relatives do so that citizens are discouraged from even associating with dissidents.

          Your US credit score is only affected by financial transactions, and only some of them (e.g. buying a large item on a debit card or with cash has no effect on your credit score), while your CCP social credit score is affected by speaking out against the government (or against official government positions, like the denial of the Tiananmen Square massacre), playing or even just buying too many video games, minor traffic infractions, and smoking in non-smoking areas.

          The two are completely, categorically different, with different goals and implementations, and you can’t even pretend that they’re “similar” at all.

      4. 4

        Detonate explosives to send packages of explosives long distances through the air in order to kill and maim indiscriminately

        Surely it still depends on where the explosives are targeted. You say indiscriminately, but really you mean indiscriminately within a radius of where they were targeted.

        I don’t automatically agree all weaponry is immoral, because they can be used to stop immoral actions. Maybe there are further arguments for this I don’t understand.

        1. 1

          Explosives are indiscriminate and untargeted. The fact that they were flung in a certain direction should not reassure you that they will be safe.

          I am not saying that weaponry is immoral. I am saying that programmers explicitly write code to support military operations, and that those programmers know that they are writing such code.

      5. 3

      Every single one of those points is a political question, in the sense that people disagree on fundamentally political grounds about whether it is moral or immoral for programmers to create programs that do these things. In this respect programming is no different from any other engineering discipline. If you think it is good for your country to wage war on another country (perhaps the other country is morally equivalent to Nazi Germany), then you will be in favor of programmers writing code to facilitate sending explosives to kill indiscriminately in that country, just as you will be in favor of chemical engineers fabricating the explosive chemicals and rocket scientists designing the rockets to carry them. If you think the war is bad (perhaps you think your country is Nazi-equivalent), why should you care more about the programmer than the rocket scientist or the chemical engineer?

        I don’t think that programmers are blind to the idea that the craft of programming has pragmatic consequences in the real world; indeed, I think that is true to exactly the same extent that it is true of any other human activity. But I think programmers differ intensely among themselves about what real-world consequences are good or bad (again, same as any other group of people). Someone who thinks that Bitcoin and other cryptocurrencies are good for the world isn’t going to be persuaded that the existence of cryptocurrency software implies the discipline of software development is “immature”, or be prepared to accept legal or social restrictions on what code can be legitimately written on the basis of that supposed immaturity.

        1. 2

          You’ve got it backwards: Politics is what happens when people coordinate to address policies, those collaborative decisions which affect everybody. The different moral backgrounds of different people contribute to, but are not the sole cause of, political discussions. The reason that moral philosophy makes any progress at all is not because it directly slams against the political barriers, but because most moral philosophers broadly agree on the intuition of what is good, and only differ on the details, the formal arguments, and the operationalization. (Also, because Ayn Rand existed, we cannot discount somebody as a moral philosopher simply because they are odious.)

          I don’t particularly care if people who endorse cryptocurrencies are swayed by my position statement, for example. They already know that they are doing harm to the world; they just don’t care. And no person can force another person to care.

      6. [Comment removed by author]

    4. 7

      I think this post attempts to boil a complex problem down too simply and ignores reasons why people might purposely write and run inefficient code for engineering or business purposes. It also possibly makes it immoral to be poor, since smaller and less well-funded teams often use quicker, less-optimized technical solutions in order to build products faster.

      1. [Comment removed by moderator pushcx: Dragging in a giant unrelated hot-button topic like covid is just trolling.]

    5. 6

      Morality is too fuzzy a concept to be thrown casually into propositional logic. The benchmark for what is or is not immoral is not well defined.

      Given this assertion, it would seem reasonable that if running code produces greenhouse gases, and greenhouse gases contribute to global warming, then running code contributes to global warming.

      Premise five seems relatively uncontroversial, though to simplify, let us just consider the length of time some piece of code takes to run (in a similar way to how we assess algorithmic complexity). If an efficient version runs in 1 minute and an inefficient version runs in 1 hour, it stands to reason that the inefficient version would consume more electricity.
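
      Concretely, under the simplifying assumption that both versions draw about the same average power while running (an assumption; real workloads differ):

      ```python
      # Energy = average power draw * runtime. The 200 W figure is an assumption.
      avg_watts = 200

      efficient_kwh   = avg_watts * (1 / 60) / 1000   # runs for 1 minute
      inefficient_kwh = avg_watts * 1 / 1000          # runs for 1 hour

      print(f"efficient version:   {efficient_kwh:.4f} kWh")    # ~0.0033 kWh
      print(f"inefficient version: {inefficient_kwh:.4f} kWh")  # 0.2000 kWh, 60x more
      ```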

      If all code in existence today were rewritten in the most climate efficient form such that its functionality was not changed, how much less resources would be used?

      1. 2

        Read https://en.wikipedia.org/wiki/Jevons_paradox

        TLDR: efficiency allows for increased usage.

    6. 6

      I’d rather present a different fake dilemma to moralize over. What if efficient code is less inclusive? Which is more important, reducing CO2 emissions or more diversity in tech?

      PS: if you find this question ridiculous and stupid, you will feel what I feel about the constant flow of these “micro-optimizations will save the planet” bullshit posts that have been appearing lately. Unless you work on something at really large scale, this is just bullshit. If you do work on something like that, efficiency is economically rewarded anyway by reducing cost. If you want to do something to save the planet, and you think saving a few kilograms of CO2 emissions can change megatrends, then don’t use a car; ride a bike.

      1. 4

        It’s such a massive waste of combined cognitive energies, these pseudodiscussions about the moral or climatological implications of software; that’s the real inefficiency.

    7. 5

      Lotta conceptual problems here. I love to see people writing, so yay. I’m not commenting to argue or even critique.

      However I think it’s good for me to publicly remember some definitions I use. Perhaps these definitions can help others.

      Morality: my relationship from my heart to some kind of higher power, be it logic, the universe, the Greek gods, the Great Pumpkin, or whatnot. Everybody has their own morals, and these morals are uniquely owned by them. Frankly, they’re none of anybody else’s business.

      Ethics: our agreement as a group of like-minded folks of standards we hold one another to such that the rest of society knows what they’re getting when they interact with our group. Doctors have ethics, as do plumbers and chess players. Groups create and enforce their own ethics.

      Legality: the minimum set of rules we all need no matter our morals or ethics in order to get along with one another in a happy and productive society. It can be legal and not moral, or moral and not legal. Same goes for ethics. All of these concepts exist independently of one another.

      Definitions of common terms and ideas: inherently unstable, which is why there have to be the three concepts above. Moral definitions are owned by each person. Ethics are owned by the associated social group. Legal definitions are a complex matter that varies depending on the government.

      So it might be morally wrong for you to do something, but frankly I don’t care. Same goes for things I find morally wrong. You shouldn’t care either. The key question to me, then, seems to be “Should it be ethically wrong to write inefficient code?” (As one commenter has already pointed out, if that’s true then it’s also ethically wrong to write code that shouldn’t be written. As such code actually increases the complexity in the coding universe, it could very well be much worse than simply inefficient code. You have to think not only of the CPU cost of the loop, but of all the downstream, second- and third-order effects as well.)

      My point: It’s easy to say whether something should be moral or not. It’s easy to get upset about what’s moral or immoral. But once you realize that you’re really talking about ethics or legality? Suddenly a very emotional and precious topic becomes a completely different kind of problem.

      That’s what works for me. YMMV.

      ADD: For the record, I think the technology development community is long overdue for a conversation about ethics, so sign me up for that. But that conversation should be about ethics, not morality. Understanding the difference changes the nature of that conversation from being yet another pointless internet yelling match to something we might all come to some kinds of agreement about.

    8. 5

      Not killing yourself is immoral, since killing yourself is the most efficient way to reduce your own greenhouse gas emission contribution.

      1. Therefore, increasing your consumption of computing resources increases your contribution to the suffering of others.

      Increasing my consumption of computing resources increases my own welfare. If others’ welfare is important, mine must be too.

    9. 5

      This is an example of how politics and activism leak into programming.

      1. 2

        Leak? Programming is imbued with the politics of the people writing the code or paying others to write code. There’s no politics-free programming. This is just an example of how hard it is to show programmers the political implications of their actions.

        1. 2

          What are the politics of me, sitting at home, writing some given program out of my own volition?

          1. 3

            At the very least, trying to be “apolitical” means passively supporting the status quo. If there is anything morally wrong with the status quo (there is), not doing anything to change it means you have taken the side of the oppressor.

            There are many, many other decisions in the world of software that have political impact, but I think the most important principle is that “if you choose not to decide, you still have made a choice.” There is always a default political effect to any action, a list of unstated assumptions behind your behavior, that you can only override by choosing to be “political”.

          2. 1

            Your will is not independent of the political activities of other people. Your right to have a home, to have the time to write software in your free time, and the idea that you should be allowed to write (or publish) software in your personal time are political preconditions for your writing, and they are not guaranteed to people different from you. The fact that you can and others can’t is due to political and social conditions; the fact that you will produce some software while a kid in a poor neighborhood with a passion for computers won’t, because he has to work three shitty jobs to stay afloat, is politics. And this reflects on the software that ends up being produced. Access to technology and access to technological production are not decided by individuals, but by a system that is the way it is because of the political desires of a few people.

    10. 3

      It’s a tricky question, and measuring on the basis of GHG emission is interesting.

      A fun place where you could argue efficiency is immoral is crypto code. Many cryptographic operations can be implemented extremely efficiently (elliptic curves, RSA, etc.), but making them maximally efficient opens up the potential for completely breaking the operation. The various constant-time options are often on the order of 10x slower, and there are RSA-specific tricks that “only” triple the cost. So strictly speaking, it could be immoral to call a crypto system safe if you were using “efficient” crypto.
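
      As a small illustration of the kind of trade-off meant here, compare a naive early-exit equality check (fast, but its runtime leaks how many leading bytes matched) with a constant-time one, here via Python’s hmac.compare_digest; the naive version is “more efficient” precisely because it stops at the first mismatch:

      ```python
      import hmac

      def naive_equal(a: bytes, b: bytes) -> bool:
          # Returns as soon as a byte differs: fast, but the timing reveals how
          # many leading bytes matched, which an attacker can measure and exploit.
          if len(a) != len(b):
              return False
          for x, y in zip(a, b):
              if x != y:
                  return False
          return True

      def constant_time_equal(a: bytes, b: bytes) -> bool:
          # Examines every byte regardless of where the mismatch is, trading
          # speed for resistance to timing attacks.
          return hmac.compare_digest(a, b)

      expected = b"expected-mac-value"
      print(naive_equal(expected, b"expected-mac-valuX"))          # False (leaks timing)
      print(constant_time_equal(expected, b"expected-mac-valuX"))  # False (constant time)
      ```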

      I’ll ignore cryptocurrencies etc. because, well, cryptocurrencies are designed specifically to adjust so as to always maximize their carbon footprint. (They phrase it slightly differently, but given that the various networks are now using as much power as many countries, this is pretty much indisputable at this point.)

      There’s a good case to be made that more efficient code means a system remains usable longer, so users don’t feel the need to replace otherwise perfectly functioning devices. That’s an environmental win on par with not having power adapters ;D

    11. 3

      That we, as software engineers, are talking about these issues and developing a dialogue is hugely important in itself. Big +1 to the Lobsters community for fostering the discussion.

      I think the argument is well constructed and the post well written. Yet as mentioned by the top comment (from @arp242), it doesn’t seem to take into account the impact of writing more efficient code versus the impact of other similarly beneficial activities.

      As far as I can see, spending time advocating for, designing or learning about solutions to climate change on a political or social level is likely more effective than whatever we’re able to do from inside a text editor; basically trying to solve the problem at the right level of abstraction. For me, that somewhat invalidates the whole discussion, though I can see how it’s relevant to talk about minimising our climate impact as participants in the software industry.

      But even if you want to keep your focus to being code related, if your objective is to minimise total suffering on ~utilitarian grounds, then I strongly suspect that time invested in other activities such as learning about tech ethics or how technology impacts democracy would pay off more in terms of their positive impact. Now more than ever, the code we write has real impact on how society operates; I propose that even in cases where our moral impact is relevant, the morally relevant effects of our code will arise from what it does rather than how long it takes to run.

    12. 3

      I wonder, can you think of any example in history where individual change has had any impact on the environment? This, versus systemic changes / policies set by the government, or innovation. But mostly innovation. I might be wrong, but in my opinion you cannot shame billions of people into compliance, even if the arguments are solid.

      Performance is a budget. If you have performance budget to spare, you’re likely to spend it somewhere else (e.g. on more features). This is what happens with people getting an electric car: because electricity from the grid is cheaper than gas, they are more likely to take extra trips, never mind the production cost of all those new cars. Putting a number on what impact this will have is very important. In the case of veganism, for example, I’ve read numbers (not sure how accurate) suggesting that by turning vegan an individual saves only a couple of bucks’ worth of carbon emissions per year. And even if that still seems significant, those savings in carbon emissions will get reallocated to industry.

      Cloud computing is interesting, because we could build apps that then get executed on thousands of virtual servers in the cloud. But code is just code. This could even be construed as a freedom of speech issue. How that code is used is the responsibility of those that run it.

      And inefficiency is relative. If you write a very inefficient script, but run it only for a couple of minutes, and it saves you hours in manual labor, that’s a net win for the environment, no? And if you optimize that inefficient script, but spend an hour doing so, then you’ve wasted an hour, with all associated costs.

      And even if you can say how much we’d save, we wouldn’t run inefficient code for long if it were expensive to do so. As long as people are paying the bill for running that code, I don’t see the problem. Note that we’ve reached this point only because the cost to the environment has been effectively subsidized: the carbon emissions were ignored, and thus not factored into the price. But we can put a price on carbon emissions, and countries have started doing that.

      So if running inefficient code, in cloud computing or on personal devices, increases carbon emissions significantly, the obvious answer is to increase the price of electricity. The price of cloud computing is very sensitive to fluctuations in the price of electricity, naturally. And even in residential areas, people would be more careful with their electrical appliances if their bill shot up after a certain threshold. Note that this will happen anyway (it already does), and it is a far better strategy.

      Give people a carbon emissions budget and they’ll spend it wisely; it is a budget that gets spent anyway.

      Global warming increases the suffering of others

      Note that it’s not so simple. People often forget that burning fossil fuels is what feeds and keeps warm billions of people. It’s hard to stop burning fossil fuels, not only because we’ve got no viable alternatives, but also because shrinking the economy does lead to an increase in poverty, and poverty is what actually increases the suffering of others. During this pandemic the carbon emissions went down, but world hunger is going up again due to the economic slowdown.

      Humanity can deal with increased water levels 50 to 100 years from now, it can deal with fluctuations in weather, and with warmer climates. We can probably deal with most of its effects, because those effects won’t happen overnight, and we can spend resources on building seawalls or whatever else is needed. Of course, if we keep going on our current path, we could end up with a mass extinction event, maybe similar to the Permian–Triassic extinction, so let’s hope we won’t reach that phase. But no, global warming does not increase the suffering of others right now; poverty does.

      The elephant in the room is that the population is growing, and all effects of global warming are inevitable, unless we develop clean and very efficient energy sources, to sustain this population growth. Personally I don’t see a way out without nuclear energy, and I think a lot of innovation in this space will come from tech companies looking into driving the cost down of their electricity bill, eventually investing in research. So basically we’ll have to innovate, to sustain the population growth, which is what humanity has always done.

      And I don’t think shaming people into using resources more efficiently works. Quite the contrary, I think it polarizes society in a way as to prevent reaching consensus on what to do next.

    13. 3

      One important issue I want to raise is the distinction between:

      • being more efficient at a task you already were doing
      • doing less / fewer tasks

      It is dangerous to entirely rely on energy efficiency because of Jevons paradox, which states that when people are given more energy efficient technology, it does not reduce energy use because people simply use the technology more.

      Doing less is the most reliable way to reduce energy usage / emissions, but also the most difficult (changing people’s expectations/habits is harder than changing machinery) and the most morally complex (e.g., if you’re an environmental group, reducing your effectiveness in organizing to address the climate crisis would be unacceptable). Nevertheless, in the developed world even people living in energy poverty are currently living above the carrying capacity of the planet, so we must make systemic change to how people meet their basic needs. In the context of programming / web design, doing less means eliminating features, triaging to keep only the most important ones, and budgeting resources carefully.

      I believe that fully addressing the climate crisis will require using green energy, becoming more energy efficient, AND doing less. (And, of course, organizing for massive political change to win a Green New Deal so that nobody is left behind in the climate crisis, but that is not directly related to programming.)

    14. 2

      In this day and age, anything that wastes energy contributes unnecessarily to global warming. And there are certainly cases where improvements to code will decrease energy usage, so we should definitely try to be efficient. But the worst thing is programs that run NOOPs in a loop to slow themselves down so that the customer pays for an upgrade whose only change is removing the NOOP loop. That’s a form of planned obsolescence and it’s pure evil. Never do that.

    15. 2
      • Inefficient code could be written to make other processes more efficient and therefore be a net win.
      • The inefficiency in the code may be due to other constraints (such as time or immediate resources), while still solving a larger problem.
      • Inefficient code might be more efficient overall if it was designed to run on resources whose costs have already been incurred.

      I don’t think this argument universally holds true.

    16. 2

      Assumption: the answer is yes. Then it follows that programming languages like Ruby, Python, Perl, or Lua are morally wrong to use, because by their very nature they do more than is strictly necessary to do the job. It would be morally correct to write all code in Rust (or C, C++, or assembly language) and to use the best data structure for processing (minimizing big-O), since those languages have no (or very little) overhead in doing what they do.
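
      To make the “best data structure” half of that concrete, here is a familiar illustration of the kind of big-O difference meant (sizes chosen arbitrarily):

      ```python
      import random

      # Membership tests: O(n) per lookup in a list versus O(1) on average in a
      # set, so the set version does far less work for the same result.
      haystack = list(range(100_000))
      needles = random.sample(range(200_000), 500)

      hits_list = sum(1 for n in needles if n in haystack)      # scans the list each time

      haystack_set = set(haystack)
      hits_set = sum(1 for n in needles if n in haystack_set)   # hash lookups

      assert hits_list == hits_set
      print(hits_list)
      ```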

    17. 1

      It’s morally wrong to breathe. Every breath you take billows CO2 into the air, bringing us one step closer to runaway global warming.

    18. [Comment removed by author]

    19. 1

      I think there is a buried over-simplification of the term “efficient”. From the perspective of a bare-metal coder, the entire tech stack of the modern SaaS world is itself massively inefficient. Big-O efficiency might be lost in the noise.

      Of course, if you accept my premise, then O(n^2) in a massively inefficient environment might be a multiplier on the moral wrong. 🤷

    20. 1

      I wonder, if you accept the premise, what the moral status is of using an inefficient algorithm when no better one is known. If the best way to solve a problem is brute force, does the moral value of using that algorithm depend on the prospect that a better solution might be discovered, and do you have a responsibility to find it?

      Also, do you have a responsibility to use efficient code? What does that mean in our world of browsers, and would it be more morally responsible (from this perspective) not to insist on cryptography or redundancy checks when they aren’t necessary?