
Thu, 20 May 2021

Content Moderation Case Study: Newsletter Platform Substack Lets Users Make Most Of The Moderation Calls (2020)


Summary: Substack launched in 2018, offering writers a place to engage in independent journalism and commentary. Looking to fill a perceived void in newsletter services, Substack gave writers an easy-to-use platform they could monetize through subscriptions and pageviews.

As Substack began to attract popular writers, concerns over published content grew. The perception was that Substack attracted an inordinate number of creators who had either been de-platformed elsewhere or embraced views unwelcome on other platforms. High-profile writers who found themselves jobless after producing controversial content appeared to gravitate to Substack (including big names like Glenn Greenwald, formerly of The Intercept, and Andrew Sullivan, formerly of New York Magazine), giving the platform the appearance of endorsing those views by providing a home for writers unwelcome nearly everywhere else.

A few months before the current controversy over Substack's content reached critical mass, the platform attempted to address questions about content moderation with a blog post asserting that most content decisions could be made by readers rather than by Substack itself. The post made clear that users were in charge at all times: readers had no obligation to subscribe to content they didn't like, and writers were free to leave at any time if they disagreed with Substack's decisions.

But even then, the platform's moderation policies weren't completely hands-off. As the post pointed out, Substack would take its own steps to remove spam, porn, doxxing, and harassment. The counterargument was that Substack's embrace of controversial contributors gave a home to people who had engaged in harassment on other platforms (and who were often no longer welcome there).

Decisions to be made by Substack:

Questions and policy implications to consider:

Resolution: The controversy surrounding Substack's roster of writers continued to grow, along with calls for the platform to do more to moderate the content it hosts. Substack's response was to reiterate its embrace of "free press and free expression," while also offering a few moderation tweaks not present in its policies when the platform first drew increased attention late last year.

Most significantly, it announced it would not allow "hate speech" on its platform, although its definition of hate speech was narrower than the policies of other social media services. Attacks on people based on race, ethnicity, religion, gender, etc. would not be permitted. However, Substack would continue to host attacks on "ideas, ideologies, organizations, or individuals for other reasons, even if those attacks are cruel and unfair."

Originally posted to the Trust & Safety Foundation website.


