
Wed, 22 Aug 2018

Before You Talk About How Easy Content Moderation Is, You Should Listen To This


For quite some time now, we've been trying to demonstrate just how impossible it is to expect internet platforms to do a consistent or error-free job of moderating content. Especially at the scale they operate at, it's an impossible request, not least because so much of what goes into content moderation decisions is an entirely subjective judgment about what's good and what's bad, and not everyone agrees on that. It's why I've been advocating for moving controls out to the end users, rather than expecting platforms to be the final arbiters. It's also part of the reason why we ran that content moderation game at a conference a few months ago, in which no one could fully agree on what to do about the content examples we presented (for every single one, at least some people argued for keeping the content up and at least some for taking it down).

On Twitter, I recently joked that anyone with opinions on content moderation should first have to read Professor Kate Klonick's recent Harvard Law Review paper, The New Governors: The People, Rules and Processes Governing Online Speech, as it's one of the most thorough and comprehensive explanations of the realities and history of content moderation. But if reading a 73-page law review article isn't your cup of tea, my next recommendation is to spend an hour listening to the new Radiolab podcast, entitled Post No Evil.

I think it provides the best representation of just how impossible it is to moderate this kind of content at scale. It discusses the history of content moderation, but also deftly shows how impossible it is to do it at scale with any sort of consistency without creating new problems. I won't ruin it for you entirely, but it does a brilliant job highlighting how, as the scale increases, the only reasonable way to deal with things is to create a set of rules that everyone can follow. And then you suddenly realize that the rules don't work. You have thousands of people who need to follow those rules, and they each have only a few seconds to decide before moving on. As such, there's not only no time for understanding context, but there's little time to recognize that (1) content has a funny way of not falling neatly within the rules and (2) no matter what you do, you'll end up with horrible results (one of the examples in the podcast is one we talked about last year, explaining the ridiculous results, but logical reasons, for why Facebook had a rule that you couldn't say mean things about white men, but could about black boys).

The most telling part of the podcast is the comparison between two situations in which the content moderation team at Facebook struggled over what to do. One was a photo that went viral during the Boston Marathon bombing a few years ago, showing some of the carnage created by the bombs. The Facebook rulebook had a rule against "gore" that basically said you couldn't show a person's "insides on the outside." And yet, these photos did exactly that. The moderation team said they should take it down to follow the rules (though there was vigorous debate), but they were overruled by execs who said "that's newsworthy."

This was then contrasted with a viral video from Mexico of a woman being beheaded. Many people in Mexico wanted it shown, in order to document and alert the world to the brutality and violence happening there, which the government and media were mostly hiding. But immediately, people around the world freaked out about the possibility that children might accidentally come across such a video and be scarred for life. The Facebook content moderation team said leave it up, because it's newsworthy... and the press crushed Facebook for being so callous in pushing gore and violence... so top execs stepped in again to say that video could no longer be shown.

As the podcast does a nice job showing, these are basically impossible situations, in part because there are all sorts of different reasons why some people may want to see certain content while others should not see it. And we already have enough trouble understanding the context of the content, let alone the context of the viewer in relation to the content.

I've been seeing a microcosm of this myself over the last few days. After my last post about platforms and content moderation around the Alex Jones question, Twitter's Jack Dorsey was kind enough to tweet about it (even though I questioned his response to the whole mess). So for the past week or so I've been notified of every response to that tweet, which seems pretty evenly divided between people who hate Alex Jones screaming about how Jack is an idiot for not banning Jones and how he's enabling hate mongers, and people who love Alex Jones screaming about how Jack is silencing dissent and how he's a liberal asshole censoring conservatives.

No matter where on the spectrum of responses you may fall (or even totally outside of that spectrum), it should come down to this: we shouldn't be leaving these decisions up to Jack. Or Mark. Yes, those companies can and must do a better job, but what people fail to realize is that the job we're asking them to do is literally an impossible one. That's why we should be looking to move away from a situation in which they even need to do it. My solution is to move the controls outward to the ends, allowing individuals and third parties to make their own calls. But there may be other solutions as well. One thing that is not a solution, though, is merely expecting that these platforms can magically "get it right."




