
Sat, 20 Mar 2021

Content Moderation Case Study: Telegram Gains Users But Struggles To Remove Violent Content (2021)


Summary: After Amazon refused to continue hosting Parler, the Twitter competitor favored by the American far right, former Parler users looking to communicate with each other -- but dodge strict moderation -- adopted Telegram as their go-to service. Following the attack on the Capitol building in Washington, DC, the chat app Telegram added 25 million users in a little over 72 hours.

Telegram has long been home to far-right groups, who often find their communications options limited by moderation policies that, unsurprisingly, remove violent or hateful content. Telegram's moderation is comparatively more lax than that of several of its social media competitors, making it the app of choice for far-right personalities.

But Telegram appears to be attempting to handle the influx of users -- along with an influx of disturbing content. Some channels broadcasting extremist content have been removed by Telegram as the increasingly popular chat service flexes its (until now rarely used) moderation muscle. According to the service, at least fifteen channels were removed by Telegram moderators, some of which were filled with white supremacist content.

Unfortunately, policing the service remains difficult. While Telegram claims to have blocked "dozens" of channels containing "calls to violence," journalists have had little trouble finding similarly violent content on the service, which has either eluded moderation or is being ignored by Telegram. While Telegram appears responsive to some notifications of potentially illegal content, it also appears to be inconsistent in applying its own rule against inciting violence.

Decisions to be made by Telegram:

Questions and policy implications to consider:

Resolution: Telegram continues to take a mostly hands-off approach to moderation but appears to be more responsive to complaints about calls to violence than it has been in the past. As it continues to draw more users -- many of whom have been kicked off other platforms -- its existing moderation issues are only going to increase.

Originally posted to the Trust & Safety Foundation website.

Read more here

