I’m not really sure this is a good fit here, but it is a long and thought-out article exploring how different companies have reacted to the needs of moderation and censorship in their businesses.
If nothing else, it’s a nice reminder that this kind of activity is very much a human one, and not an immediate win for algorithms.
There’s a lot here. Two points.
Some of the moderated content is “disturbing”. Incidental exposure to disturbing content (as typical users may experience) is survivable, but what are the ethical considerations for long-term exposure to concentrated disturbance?
There’s been a decade of strife and hoopla over various policies, like no breastfeeding on Facebook. But users forget the first line of moderation is user reports. The mod team at Facebook banned your boobs because one of your “friends” reported the picture. Whether or not it’s right for Facebook to censor such images, it’s not just soulless corporate drones making these decisions. It’s people you know. It’s the guy from the tennis club, the woman from the office. They are the ones complaining.
Yes - it makes me wish that more online spaces afforded an option like “put it behind a content warning that specifies the nature of it” as a smaller hammer than “delete it”.
This article is really fantastic. I admit I had not given a ton of thought to who does content moderation and what the policies for it are. It was really eye opening to read.
A fascinating take, but seems to blindly assume that stricter moderation is always better, particularly for marginalised groups. I’ve said before that 4chan and Reddit’s policies and processes are much easier for many people to operate with (particularly lower-class people or those with less social sensitivity), and as a result they have been wonderful spaces for those categories of (often marginalised) people.
People with less social sensitivity are not a marginalized group. A marginalized group is one that lacks access to what others in a society have, like the same quality of education, housing, food, etc.
In any case, having strict moderation isn’t about protecting minorities; it is about setting the lowest common denominator for a community so everyone can participate. It’s like vegan food: vegetarians and meat eaters can eat vegan food just fine, but vegans will not eat meat or eggs. So content moderation makes a community that the majority of people will feel comfortable in.
I do think that this is a dimension in which communities should vary. It leads to different types of interaction, and different audiences altogether.
I absolutely agree that the cultural norm should be towards more thought about the reader, compared to where it is presently. I don’t see moderation as the only tool to get there, but it is a tool.