I was an undergrad at UC Davis and I knew Professor Rogaway, the author of this talk! Very smart and thoughtful. He strongly discouraged me from going into machine learning as a career because he said that he did not see it as a career in which it was possible to do good things. I ended up going into AI applied to computational biology, and hopefully I am able to do good things within that, but I think about his comments sometimes. He is also largely responsible for giving the computer science department at UC Davis its own ethics class, which focuses on the specific global impacts of computer science, separate from the engineering ethics class, which talks about plagiarism and such. I think that class did a lot of good for the students graduating from UC Davis.
separate from the engineering ethics class which talks about plagiarism and such
I once complained to a fellow ethics professor about the naming of such “professional ethics” classes which are not about ethics at all but rather about what rules someone requires you to follow, and they said “They have to call them something”!
Anyone know of other places to read/follow him? His UCD page seems dormant. I can understand why he might stay away from the socials, but I’d like to see more of what he’s thinking.
There are a lot of ideas being expressed in this paper, many of which I think are either factually incorrect or ethically incorrect.
The climate crisis is here. The biodiversity crises. 6th mass-extinction. Pandemic disease. Huge wildfires. Tipping points. And with these things: social, political, and economic turmoil; civilizational collapse. For young people: the future is bleak.
These things are either ills as old as civilization itself (pandemic disease, civilizational collapse), or are related to the entire project of technological modernity, which 1) has also led to huge, unprecedented-in-human-history improvements in living standards and alleviation of suffering, and 2) isn't particularly related to computer science, except insofar as computer science is one of the many fruits of technological modernity. The COVID-19 pandemic that he seems to be referencing was not great, but I'm incredibly glad to have lived through that pandemic with the benefit of modern technology - including but not limited to computer technology - rather than, say, the Plague of Justinian, or the Black Death, or any smallpox epidemic in the history of humanity up until we eradicated smallpox in 1979.
Global warming due to greenhouse gas emissions is a problem, but that’s also not particularly related to computer technology, and in any case no one wants to live in a world where greenhouse gas emissions are lessened because gasoline costs $100/gallon and that price is passed along to everything in the modern economy for which gasoline is an economic input.
I’m really not a fan of the “Why does techno-optimism dominate?” slide, which is basically a bunch of lame joke references that have something vaguely to do with technologies. I’m personally largely a techno-optimist because I recognize that the world I was born into, 300-odd years into the project of the industrial revolution and technological modernity, is one that provides vastly better health, wealth, comfort, and safety to more people than had ever happened for the majority of human history before it; that this is almost entirely due to the fact that better technology lets humanity harness more energy and protect ourselves better against an uncaring universe; and that since we still live in a world with lots of human suffering and imperfection, we ought to continue with this project and create even better technologies to increase the amount of human safety and freedom that exists in the world.
“Plastics” are actually very good - you aren’t consciously aware of all the times you failed to get a disease that would’ve killed you had you contracted it in the 17th century, because our society pervasively uses plastic wrap to mitigate bacterial contamination of things like food and antibiotics. Plus all the other things plastics are useful for.
Radical CS recognizes that CS — and technology more broadly — embeds values. It is never neutral. It rearranges power. It has tended to disproportionately empower big corporations, tech workers, and the elite. Doing so, it creates significant peril for people and the planet.
It’s true that technology embeds values, but the specific values that any given technology embeds are vague and unconnected enough that it’s difficult to make meaningful generalizations. I don’t think it’s true at all that computer science disproportionately empowers big corporations, tech workers, or the elite, or that it creates significant peril for people that wouldn’t exist in its absence.
Computer science is a huge collection of things that is hard to make any specific generalizations about, and I think if Rogaway provided specific examples of ways that he thinks CS empowers big corporations or the elite, it would be pretty easy to think of examples of CS disempowering those same demographics. (CS empowers "tech workers" in the trivial sense that if CS is useful to people, the people who understand it best will both be able to use it most effectively and are likely to get a job working with it rather than something else, but that doesn't seem bad; lots of people can learn to use or work with computers.)
So you might think that cryptographers would be ashamed and aghast about mass surveillance revelations. You’d be wrong. My community thinks things are going great, and that mass surveillance is not our concern.
This seems pretty straightforwardly wrong to me. Cryptographers seem like the sort of people who are most concerned about mass surveillance (which I think of as being prototypically the Edward Snowden revelations about US government spying from the early 2010s). The response from cryptographers that I saw was to increase work that was already happening on developing free software tools to allow ordinary people to send messages to each other that even state actors couldn’t decrypt, and also promoting the desirability of such tools to the public at large.
And indeed, over the past decade or so since the Snowden revelations a lot more software tools that ordinary people use have strong encryption built in, and encryption tools have better user experiences than they used to. The fact that I can video chat with my parents using Signal, and that this works pretty seamlessly, is a major anti-surveillance victory that cryptographers brought about.
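As a toy illustration of one primitive that such tools are built on - message authentication, which guarantees a message wasn't tampered with in transit - here is a minimal sketch using only Python's standard library. This is a deliberately simplified example, not how Signal actually works: real messengers layer key agreement, authenticated encryption, and ratcheting protocols on top of primitives like this.

```python
import hashlib
import hmac
import secrets

# A shared secret key, as would be established between two parties
# by a key-agreement protocol (hypothetical setup for this sketch).
key = secrets.token_bytes(32)

def tag(message: bytes) -> bytes:
    """Compute an HMAC-SHA256 authentication tag under the shared key."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, t: bytes) -> bool:
    """Check a tag using a constant-time comparison to avoid timing leaks."""
    return hmac.compare_digest(tag(message), t)

msg = b"meet at noon"
t = tag(msg)
assert verify(msg, t)                      # genuine message accepted
assert not verify(b"meet at midnight", t)  # altered message rejected
```

Without the key, an eavesdropper cannot forge a valid tag for a modified message; the point of the post-Snowden tooling push was to make guarantees like this the invisible default rather than something only experts could set up.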
Encourage students to feel, not think
I think this is a really bad principle to use when trying to teach people to act ethically. When people feel instead of think, they often make counterproductive decisions because they can’t see and aren’t trying to see all aspects of a problem.
Enriched version of slide deck with notes, PDF
lmao call them “ethically questionable legal compliance for beginners”
EXACTLY
Direct link to pdf
Is there a recording?
This is the recording of the workshop, the very first talk is the one that was referred to here. https://www.nist.gov/video/third-nist-workshop-block-cipher-modes-operation-day-1