1. 10

  2. 6

    From one of the sources linked in the article (https://www.iea.org/reports/data-centres-and-data-transmission-networks):

    “Rapid improvements in energy efficiency have, however, helped limit energy demand growth from data centres and data transmission networks, which each account for ~1% of global electricity use.”

    I was pretty surprised to see that the usage for data transmission was as high as that for datacenters. Reading further, that turns out to be mostly because of mobile networks/cell towers.

    I did some research of my own and found this (https://www.telecompetitor.com/study-5g-has-90-better-energy-efficiency-than-4g/):

    “5G networks are up to 90% more energy efficient per traffic unit than 4G networks, according to a new 5G energy efficiency study from Nokia and Telefonica.”

    I’m not sure how accurate/unbiased that is, but unless it’s a straight-up lie, it seems that 5G is a huge improvement in energy efficiency compared to 4G. In the article, the author actually lists the change from 4G to 5G as an example of “maximalism” in technology.

    One other thing I noticed was that they spent a few paragraphs discussing the end of Moore’s law, listing it as a reason that we can’t just count on hardware to keep energy usage flat like it has in the past. However, as a programmer with a big interest in efficiency, I would point out that there are huge improvements in software too, which are helping and will continue to help:

    • More efficient modern compression formats for data and images such as WebP, AVIF, JPEG XL, AV1, Brotli, etc. dramatically improve compression ratios for all kinds of media
    • Modern serverless technologies like Lambda and Cloud Run as well as orchestration layer solutions like K8s help maximize hardware utilization and reduce waste
    • Edge computing like Cloudflare Workers reduces network hops and shortens the distance between users and the data
    • Switching from x86 to ARM CPUs in both servers and user devices provides much greater energy efficiency and performance per watt in most cases
    • Google Cloud shows the carbon cost of various datacenters and helps users choose less carbon-intensive datacenters for their applications if they want to

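    To make the compression point concrete: Brotli, WebP, AVIF, etc. all need third-party libraries, so here’s a rough sketch using only the Python standard library’s zlib (DEFLATE, the basis of gzip) and the newer lzma as stand-ins. The sample payload is made up and the exact sizes will vary, but it shows how much a stronger codec can shave off the bytes on the wire:

```python
import zlib
import lzma

# A made-up, highly redundant sample payload.
sample = (b"Data centres and data transmission networks each account "
          b"for roughly one percent of global electricity use. ") * 200

deflated = zlib.compress(sample, level=9)  # DEFLATE, the algorithm behind gzip
lzma_out = lzma.compress(sample)           # a newer, stronger general-purpose codec

print(f"raw:     {len(sample)} bytes")
print(f"deflate: {len(deflated)} bytes")
print(f"lzma:    {len(lzma_out)} bytes")
```

    Every byte saved here is a byte that never has to cross the mobile networks discussed above.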
    The author includes this sentence near the end of the page:

    “A radical, large-scale change is urgently needed.”

    For one, I don’t think this is really true, and there is already tons of work going on from developers and companies big and small toward greater efficiency and environmental friendliness in software and hardware. Of course, I think that making decisions with the environment and sustainability in mind is the obvious choice. However, when it comes to the growth and expansion of technology (especially software), the result is usually positive in that regard. Happily, using less power and computing resources almost always means lower cost, so cost will always be a driving force toward using less - especially if Moore’s law does break down and slows hardware’s incremental advances.

    I will personally be embracing the growth and expansion of the web and technology in general. I’m sure there is some point in the near or distant future where we will hit a wall - either cultural, social, technological, or otherwise - that halts our progress forward, but I see no signs of it currently. While there is beauty and goodness to be found in the small, local, tangible, and manageable, the possibilities of a free, open, growing, and unlimited shared virtual world are impossible to match, in my opinion.

    1. 6

      I think that what underlies their article is that Jevons paradox has held true thus far, and there’s no reason to think it won’t continue to. Efficiency gains are leading to greater consumption, not less. And the extra consumption is rarely essential or important – if it were, our limited resources would already have been put to use doing it.
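
      A toy back-of-the-envelope version of this point, reusing the 5G figure quoted above with a made-up traffic growth factor (all numbers are purely illustrative):

```python
# Jevons paradox in miniature: a big per-unit efficiency gain can be
# outpaced by growth in total consumption. All figures are hypothetical.
energy_per_unit_4g = 1.0    # arbitrary baseline energy per traffic unit
energy_per_unit_5g = 0.1    # "up to 90% more efficient per traffic unit"
traffic_growth = 15         # hypothetical: total traffic grows 15x

before = 100 * energy_per_unit_4g                   # 100 traffic units on 4G
after = 100 * traffic_growth * energy_per_unit_5g   # far more traffic on 5G

print(before, after)  # total energy rises despite the efficiency gain
```

      Whether real traffic growth actually outruns the efficiency gain is exactly the open question here.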

      1. 4

        Individual 5G base stations really do consume less electricity, but we’ll need many more of them compared to 4G. I’m not sure whether the total will still turn out to be lower.
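
        The same arithmetic, sketched with station counts instead of traffic - these numbers are hypothetical placeholders, not measurements:

```python
# If denser deployment outpaces the per-station savings, the network
# total still goes up. All figures are made up for illustration.
power_per_station_4g = 1.0   # arbitrary unit
power_per_station_5g = 0.6   # each 5G station draws less
stations_4g = 100
stations_5g = 250            # hypothetical: far denser 5G deployment

total_4g = stations_4g * power_per_station_4g
total_5g = stations_5g * power_per_station_5g
print(total_4g, total_5g)  # denser deployment can offset per-station savings
```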


      2. 4

        My problems with this article are neatly summed up by this quote:

        The Maximalism of the games industry is also reflected in changes in language. For example, in the 1980s, the word “gaming machine” (Finnish: pelikone) was often a pejorative - referring to a machine that was not suitable for serious use. Games were therefore an application that even the smallest microcomputer was capable of. In the 21st century, a ‘gaming machine’ is more likely to be expensive, powerful and in need of constant upgrades. Similarly, we might consider the term ‘good graphics’, which in the past referred, for example, to skilfully drawn and esthetically pleasing pixel images and well-programmed graphics routines, but which in the 21st century has come to refer more to the technical graphics capabilities of the hardware and whether a game supports them.

        “Good graphics” was always about the technical capabilities, and gaming machine vendors have always competed on how good their hardware was. People made “skilfully drawn pixel images” because they were trying to push against the limits of the weak hardware, not because they aesthetically valued pixel images. As soon as they got better hardware, they switched to more complex graphics.

        And they switched because consumers wanted that. This article frames “maximalism” as if it’s the evil companies pushing this on a hapless society, when all of these things are the companies responding to the demands of users. People like nice things! And it’s been this way since long before computers. Is “maximalism” to blame for society replacing their radios with TVs, and then their TVs with color TVs?

        1. 2

          Recently read this too, and to link a potpourri of thoughts: