Way back in the mid-1990s, when the web was young and the online world was buzzing with blogs, a worrying problem loomed. If you were an ISP that hosted blogs, and one of them contained material that was illegal or defamatory, you could be held legally responsible and sued into bankruptcy.
Moderation, however, has two problems. One is that it’s very expensive, because of the sheer scale of the problem: think of how much material is posted on Instagram every day. Another is the way the dirty work of moderation is often outsourced to people in poor countries, who do it for pittances. The costs of keeping western social media feeds relatively clean are thus borne by the poor of the global south.
The platforms know this, of course, but of late they have been coming up with what they think is a better idea – getting machines to do the moderation instead. There are two ways of answering this. One is via HL Mencken’s observation that “For every complex problem there is an answer that is clear, simple, and wrong.” The other is by asking a cybernetician. Cybernetics is the study of how systems use information, feedback and control to regulate themselves and achieve desired outcomes.
There are really only two ways to deal with it. One is to choke off the supply. But if you do that you undermine your business model – which is to have everyone on your platform – and you will also be accused of “censorship” in the land of the first amendment. The other is to amplify your internal capacity to cope with the torrent – which is what “moderation” is. But the scale of the challenge is such that even if Meta employed half a million human moderators it wouldn’t be up to the task.