1. 15
    1. 5

      “How do you feel about the deaths in Myanmar and India based on your creation?” “What we really want to do is fix the problem. We really want to get to solutions. I think getting to solutions is important.”

      Mark Zuckerberg is responsible for the deaths in Myanmar and India because someone used his service in the process? A huge service with a billion users doing all kinds of things in various jurisdictions and cultures? That’s ridiculous. He shouldn’t answer such an aggressive, blame-shifting question. He should’ve roasted the interviewer on the spot. He couldn’t, for the same image reasons that make him dodge things he’s actually guilty of.

      While we’re at it, are we supposed to hold the food and water suppliers guilty because they kept murderous criminals and governments not only alive but with the energy needed to do their dirty deeds? Should they have profiled every use of their products before letting people eat? Or are these people, wanting to achieve popularity by assigning blame, only expecting specific companies or groups to have this responsibility that everyone else dodges?

      1. 4

        The author of the post has responded to this reading of it.

        1. 1

          Has anyone asked TBL the same question? What was his response?

          1. 2

            The difference is that TBL can’t shut down the internet, but Facebook could have entirely stopped the usage of their platform for this specific event, potentially preventing it from reaching the massive scale it did (especially when combined with their ownership of WhatsApp).

            They could have just shut down their service in certain places! The info about how this stuff was being coordinated is out there. They could have stopped being the active communication platform for massive violence

            When the house is on fire you stop trying to take a nap on the couch

            1. 2

              That easy? What percentage of Indian WhatsApp messages were organizing mob killings?

              Why didn’t the Indian government simply cut off access to WhatsApp? Isn’t it the government’s job to do that?

        2. 1

          “Read it again, man. It was not an accusation. It was a question he flat-out refused to answer.”

          She was talking to a guy being accused of all kinds of stuff, representing a public company, where anything he says might be used against him. Even something good might turn into an article claiming he only said it for the publicity. Those people have to be careful. From that angle, what she said definitely comes off as a ridiculous accusation setting Zuckerberg up for a heated argument that won’t benefit him. That’s on her for asking such a question.

          “But he couldn’t even manage to spit out a vaguely-human sounding answer.”

          She was trolling him for a reaction but expecting Zuckerberg to act humanely. Pot calling the kettle black.

          If I were trolling him, I’d open with all the times he and Facebook convinced people privacy didn’t matter before he bought all the houses around his for privacy. “Have you changed your mind about the importance of users’ privacy, Mr Zuckerberg? What are your current thoughts about that?” There would be no contest that he was responsible for his prior speech and recent purchases. At that point, he’d try to weasel out of it or give an interesting response. Maybe even a somewhat sincere one. Claiming he’s morally responsible for any possible use of an automated, global, free service is just going to lead to a poker face and/or comments like mine.

          Edit: Read it earlier. Forgot jwz wasn’t the interviewer but was defending the interview. Fixed the attribution.

          1. 7

            The article is trying to capture an example of a common problem among engineers, managers, industry, anyone involved in “tech disruption”. These people and organizations (unconsciously, or willingly) try to disown any responsibility for the things they create and the effects they bring into the world. A much lower-profile example is Harvard researcher Hau Chan, who gave a presentation at a Feb. 2018 conference about using AI to profile potential gang members: when asked during the Q&A about racial bias in training data, or misuse of the technology by oppressive regimes or police, he replied “I’m just an engineer” - prompting one attendee to shout “Once the rockets go up, who cares where they come down?” in his best Wernher von Braun accent and storm out.

            Contrast that to Einstein, who felt quite conflicted and ultimately remorseful about his role in bringing forth the theories leading to the development of the atomic bomb. There are several other examples in the jwz thread.

            My read on this is that the interviewer is asking only half-sincerely, but it also isn’t meant as pure trolling. It’s an attempt to pin down exactly how far Zuckerberg feels his responsibility should extend for the world-disrupting platform he’s created. If he doesn’t feel this is his responsibility, fine, say so. What about election interference? What about helping to perpetuate cyber-bullying and hate crimes? How about fostering organization of white supremacist groups?

            As his testimony in Congress showed, he’s unwilling to commit on any level to anything even remotely close to responsibility. Which, IMO, fairly leaves him open to questions like this. Perhaps if he (or government regulation, or somebody) would put forth actual boundaries, this question would be moot.

            Similar questions could and should be asked of any Western company doing business in China, helping them prop up their state surveillance systems. And so on.

            1. 2

              Before I disagree, I want to say your comment makes a lot of sense and you said it all really well. Your goal is admirable. The problem is with the source and execution here.

              “My read on this is that the interviewer is asking only half-sincerely, but it also isn’t meant as pure trolling. It’s an attempt to pin down exactly how far Zuckerberg feels”

              It isn’t. The interviewer knows this if they’ve watched any interviews or studied Mark Zuckerberg. Check this out. Zuckerberg turned down $15 billion because he believed in his company that much and couldn’t be negotiated out of it. I give him props on that since I don’t think I could do it. That’s amazing, crazy, or something. He later went public after dominating that market. He’s rich as hell. He’s also recently paranoid due to elections being manipulated via his service plus tons of negative coverage. That’s the smart, calculating, careful asshole she should know she’s talking to if she’d done the tiniest bit of journalism.

              Then, she tries to “pin down” how he “feels” by accusing him of being responsible for some stuff in a foreign country he may or may not know about because his service was used at some point. She just keeps hitting him over and over with it, by her own admission, in a tactic that’s common in corporate media to smear opponents instead of really interviewing them. By using that tactic, she created an instant response in a guy who’s dealt with both drama-creating and real journalists for many years now. That response was to know she was either working his ass for ratings or just too incompetent for someone on his level. The next move was blocking her attempt, whether he looked dumb or good doing it. One guy made a whole Presidency work out by pretending to be dumb. Aloof or dumb is a proven strategy in such situations where arguing back can be seen as “attacking” the interviewer or suggestive in some other way.

              “Contrast that to Einstein, who felt quite conflicted and ultimately remorseful about his role in bringing forth the theories leading to the development of the atomic bomb. There are several other examples in the jwz thread.”

              If Einstein hadn’t, someone else would have. Many were closing in. He felt conflicted but probably shouldn’t have. As I got older, I started leaning toward a belief that one shouldn’t feel bad about what other people do with their time/work or yours. It’s on them. Don’t make it easy to do harm if you can avoid it. Past that, don’t worry about it since scheming assholes are a natural part of humanity. It will keep happening whether it’s your work or someone else’s. The reason is the schemers are scumbags. Many are persistent ones, too. Some are so smart and obsessive you could never stop them if you tried. My field, high-assurance security, specifically aims at these people. You’d go nuts if you equated your own responsibility to the successes of the best and brightest attackers. Gotta take it down a notch.

              “As his testimony in Congress showed, he’s unwilling to commit on any level to anything even remotely close to responsibility.”

              Like most Americans who don’t even participate in the political process or vote with their wallet. They mostly act apathetic and selfish. He does, too. He’s just doing it in a way with wide impact that made him a billionaire. I’m not grilling Zuckerberg any harder than the billion users of his surveillance-driven, always-in-the-media-for-scheming-bullshit service who made him rich in the first place. This is both a demand-side and regulatory problem. On the regulation side, there’s tons of voters making sure there aren’t any regulations blocking abuse of data, elections, etc. So, my position is fuck the people responsible for creating one Zuckerberg, Henry Paulson, and Bill Gates after another since they don’t give a shit. Some mercy included for those that haven’t learned yet since they didn’t have the opportunity.

              Just saying: you won’t change anything if the people and/or systems that keep creating these monsters carry on without any changes.

      2. 2

        The interviewer doesn’t really think Zuckerberg is responsible for genocide in Burma any more than he does. She’s not even trying to get an answer from him. She’s simply performing for her target audience, trying to show everybody that she’s a “real journalist” and willing to “stick it to the man” by trying to get him to say something that could cost the company a pile of cash.

        1. 1

          Well said. That’s another possibility that would lead to him having a non-reaction or rational evasion. It ties in with my claim, too, that she was pulling some kind of stunt.

      3. 2

        I don’t think jwz is recommending a retributive model of responsibility (wherein we punish people for making decisions that are only clearly poor in retrospect). Instead, looking at how Facebook contributed to the deaths in Myanmar helps to identify red flags, so that Facebook can behave differently next time. This is not really compatible with the purely-forward-looking policy Zuckerberg appears to be promoting – one where Facebook looks only at its own behavior in a vacuum and therefore is almost guaranteed to double down on flawed assumptions. Unexpected social behaviors can only be understood by taking societies into account.

        1. 1

          “one where Facebook looks only at its own behavior in a vacuum and therefore is almost guaranteed to double down on flawed assumptions.”

          That’s the model Facebook is supposed to follow. It’s called capitalism: each party being as selfish as possible, maximizing its own gains, externalizing all losses, and otherwise not giving a shit. Something like half of America votes in favor of capitalism every year. Most of Facebook’s users, especially after many news articles, know it’s a for-profit, publicly-traded, scheming, surveillance company and are supporting its evil behavior by using it. Facebook’s incentives, forced by prioritizing shareholders, will ensure they always scheme on users more over time. The moral solution is to simply avoid Facebook as much as possible in favor of more ethical providers.

          There’s always been more private or morally-assuring ways to communicate. The market, both paid and free, massively votes against those providers in favor of scumbags like Zuckerberg. The choices of consumers and voters have collectively led to his dominance and wealth. As a rational capitalist, he should continue paying lip service while letting other people suffer or die. As utilitarians minimizing harm, jwz and I should be using Facebook as little as possible (necessary evil w/ family), putting as little personal info on there as possible, and sending messages through ethical, private services. Even if contacted on Facebook, we should reply back in another app if possible.

          This is what I do, with being off Facebook having had an enormous social cost to me. I’m actually going to have to get back on in the future, using it in the ultra-minimal way stated above. Still, I’m standing up for my principles instead of just talking about principles. jwz whining that a capitalist running a publicly-traded surveillance company should care about people more is just a foolish publicity stunt. Even if Zuckerberg improves, his actions will make such an evil company look more desirable to current and future users, perpetuating the evil instead of supporting non-invasive, ethical alternatives. It logically follows that jwz is a hypocrite if he isn’t pushing people off Facebook or to minimize their use of it. Also, all the time he spends whining about Zuckerberg is time he could be promoting alternatives like Mastodon.

          EDIT: I’ve reposted a version of this comment on author’s blog, too. Let’s see what happens. Off to work now.

          1. 2

            I largely agree.

            However, as you mentioned, Facebook users are largely both aware of and accepting of the foundations of Facebook’s existence: they know that Facebook is a centralized capitalist enterprise that continues to exist only by selling their information, & they think all of that is worthwhile.

            It’s possible to want people off Facebook while also wanting Facebook to be more mindful of the way the remaining users are managed. At the same time, pointing to partial responsibility in political turmoil (and painting Zuckerberg in particular as uncaring), even if it’s a little misleading, is liable to get people off Facebook without needing them to recognize more general problems with capitalism (or even swallow ideas that would make them less comfortable with Google).

            Incremental change isn’t incompatible with radical change: when radical change isn’t viable, incremental change is all that’s possible, while incremental change can be pursued in conjunction with more radical experiments. (Having no safety net doesn’t encourage people to go all-in on risky propositions: it discourages them from taking any risks at all.)

            With regard to Mastodon: jwz writes a lot of blog posts & I have no idea whether or not he uses or promotes it. I get the impression that his readership is more culturally similar to lobste.rs than to Mastodon: lots of people who work in tech, not a whole lot of gay communist furries. I’m not sure how well a big push from jwz would go. (There were big pushes from j3st3r & John McAfee, toward patched instances where federation was broken, and I don’t think either lasted longer than 48 hours. jwz is not j3st3r but neither is he the IWW.)

          2. [Comment removed by author]

    2. 5

      As I said in my comment at the end, I feel NOT ENTIRELY GREAT that something that I worked on played a part in fucking destroying democracy. That’s super-duper not what we had in mind. And I feel NOT ENTIRELY GREAT about that.

      I’m not familiar with this story, but I too have felt increasing guilt that anything I’ve worked on may have enabled evil people to do evil things. The internet was not supposed to turn out this way.

      1. 2

        The first sentence of the author’s about page explains that they were “one of the founders of Netscape and Mozilla.org”.

        1. 2

          Thanks, I’m familiar with jwz. I was referring to the “lives lost” part of the post and its relation to Facebook.

          1. 1

            The genocide of a Muslim minority in Myanmar/Burma has been aided by the use of Facebook to deceive those fleeing violence into traveling to regions where they end up getting killed anyway.

        2. 1

          The only browser company that minimizes spying on and selling out users, limiting itself to making money off search-engine defaults, which can be changed to DDG. I think what the author worked on has aided democracy and large masses of people if anything. The worst I can really say about Mozilla is their management sucks at listening to users and creating new products. They could keep cranking out new products that were private, kept data user-owned, and were user-extensible by default, for a mix of free and paying users. They’d get lots of uptake due to being more trustworthy than the competition. And so on.

          I can’t say Mozilla or Firefox was a force of evil, though. It’s done way more good.

    3. 3

      I find the engine problem vs. crime scene problem a nice analogy. Engineers can address engine problems. Crime scene problems need to be addressed by politics, preferably democratic (not the party). This is not new: the train, the car, electricity, etc. have increased the quality of life for many but also created problems which needed addressing by politics. This is basically a recognition of inventions having downsides, which is sometimes ignored in the silicon valley positivity. Then, a renegotiation of goals and qualities needs to take place. The engineer should be a responsible participant.

      1. 2

        I’m not sure I completely agree. Politics is also an engineering problem (or can be looked at this way): we can analyze what kinds of individual behaviors create the kinds of group behaviors we want or don’t want, & look at incentives and friction points. The key is to avoid reifying a process or technique as a goal in and of itself.

        Capitalism goes out of control because people get emotionally attached to the idea of market efficiency & fail to notice pathological situations. Likewise, Facebook’s internal ethos is one that considers communication to be a good in and of itself, and Facebook has had a really hard time recognizing when communication becomes pathological. Facebook & Twitter can produce never-before-seen levels of flatness in communication structures, and their inability to clearly see the failure modes of their own preferred techniques lets them continue to apply them (or encourage others to apply them, even as they are the best qualified to observe unintended side effects).

        This failure of recognition is common since it’s basically irrelevant in the absence of extreme power asymmetry: a stray Mark Zuckerberg can believe whatever he wants & he’ll have his actions cancelled out by somebody who believes the opposite, but a corporation can be a force multiplier for ideology while simultaneously protecting decision-makers from evidence of their failures.