1. 16
  1.  

  2. 5

    I think he leaves out a major benefit of assuming good faith: if people in tech generally genuinely want to help, that means that appealing to that desire is tactically effective (in a way that money might not be). So, you can change social norms by pointing out that current ones are harmful and suggesting better ones. None of that is the case if you’re targeting people unmotivated by the welfare of others.

    (In a community where people generally act in good faith, social capital is based in part on whether or not people are perceived to be acting in good faith, so even sociopathic people who don’t actually care about the greater good can be convinced to act in such a way that they work toward the greater good, since doing otherwise ruins them and removes them from power.)

    1. 5

      I like several bits of this and dislike several other bits, but one thing that stands out to me is the seemingly continual refusal to place responsibility on the people consuming and using tech:

      > When software encourages us to take photos that are square instead of rectangular, or to put an always-on microphone in our living rooms, or to be reachable by our bosses at any moment, it changes our behaviors, and it changes our lives.

      Users volunteer for this. Most of the people taken advantage of by tech get there by sleepwalking, like lemmings, into the grim meathook future some of us create to monetize them. Nobody holds a gun to their head and says “Put Amazon Echo in your house or you get shot by the Bezostruppen.” Nobody says “Hey, you should totally enter a multiyear contract for this smartphone that will bleed you dry and spy on you instead of using a cheapo burner phone, or else we will put you in jail.” There is no national law that says “Citizen, you must participate in the two-minutes hate on Twitter or else your voting privileges will be revoked.”

      There is no end of the trouble we get into if we ignore the actions, the real actions, that got us here.

      1. 11

        Interestingly, lemmings don’t actually walk to their death as their environment typically only contains lakes they can swim across. Put them in front of an ocean though…

        Which is actually the perfect metaphor. Put people into environments they are unfamiliar with, maladapted to, and unable to even ask the right questions about, and it isn’t all that surprising that they won’t ultimately act in their own interests.

        But I guess we can blame the people who software hurts for being hurt by that software, which they couldn’t hope to understand without deep study.

        1. 4

          When there is minimal consumer choice, it’s hard to blame consumers for making the wrong choice. Robust consumer choice would be something close to feature-by-feature optionality: smartphones without spy powers, or only with photo spy powers, for example. In point of fact, even a smartphone with a hardware keyboard is a non-option nowadays.

          This is due in no small part to the limits and strengths of mass manufacturing: if everyone buys the smartphone that’s good for 51% of the people, we all enjoy a better phone for less money – but that puts the power of feature selection out of consumers’ hands. They get the phone that the designers designed: take it or leave it.

          The responsibility – moral and otherwise – for those features rests squarely with those who made the phone, not those who bought it.

          1. 4

            > Nobody says “Hey you should totally enter a multiyear contract for this smartphone that will bleed you dry and spy on you instead of using a cheapo burnerphone or else we will put you in jail.”

            No, sure. But (to take just this example) the contract and the undeniable benefits of the smartphone are what’s advertised, sold, and heavily pushed – obviously without any reference to potential downsides – to the extent that many won’t even realise there’s an alternative. And when the availability of those features and capabilities is normalised to the point that getting by without them involves significant extra effort, then for the majority of the world outside of “people who understand technology, and can either afford to or have to spend significant parts of their time understanding it”, that’s effectively all that exists.

            1. 1

              Exactly. I’ll add that this is true even when the constraints of two solutions are similar enough that the safer/higher-quality/freer one requires no sacrifice, or less. Getting people to switch from texts to IM – important, since texts were a downgrade from IM (especially with the delays) – was hard despite equivalent usability, the better thing being free, the better thing having more (optional) features, some of it being private, and so on.

              An uphill battle even when the supplier went above and beyond expectations in making a better product for them. Usually laziness or apathy was the reason, once other factors were eliminated.

            2. 1

              I’m always confused by this fusion of technology and societal impact. A couple of things: technology is definitely a tool, and as such morally neutral in itself; the morality changes with the people using it and the context. What does this mean? It’s common sense that people, NOT technology, should be at the forefront of solving social problems. As for ethical training for programmers: except for the few of us working on critical systems, no one wants ethical responsibility. Imagine if with every commit you presented a legally binding attestation to a code of conduct, certifying that you had verified the system to be free of failures, with any violation costing you a hefty fine or worse. Of course, there would be benefits to this: as in most other professions, the supply of programmers would rapidly shrink and become controlled, and code quality would likely go up.

              The reality is that for the huge majority of our field, if we make a mistake, no one bleeds. That’s a huge advantage in our line of work: a mistake is usually not a big deal. On top of that, we’re as well paid as professions that don’t have the same benefit. I suspect this is also something most programmers enjoy.