Thu, 08 Oct 2020


Content Moderation Case Study: Suppressing Content To Try To Stop Bullying (2019)



Summary: TikTok, like many social apps used mainly by a younger generation, has long faced questions about how to deal with bullying on its platform. According to internal documents leaked to the German site Netzpolitik, one way the company chose to address the problem was through content suppression, and specifically by suppressing the content of users the company felt were more likely to become victims of bullying.

The documents showed several ways in which the short videos TikTok is known for could be rated for visibility. Content could be chosen to be featured (i.e., shown to more people), but it could also be rated "Auto R," a form of suppression. Content rated "Auto R" was excluded from TikTok's For You feed after reaching a certain number of views. Because the For You feed is how most people watch TikTok videos, this rating effectively put a cap on views: the reach of "Auto R" content was significantly limited, and it was prevented from going viral and amassing a large audience or following.

What was somewhat surprising was that TikTok's policies explicitly suggested placing those who might be bullied in the "Auto R" category, even stating that users who were disabled, autistic, or had Down Syndrome should be rated this way to minimize bullying.
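The leaked documents describe the effect of the "Auto R" rating but not how it was implemented. As a rough illustration only, here is a minimal sketch of what such a view-cap rule could look like; the threshold, names, and structure are all assumptions made for this example, not TikTok's actual code:

```python
# Hypothetical sketch of a visibility cap like the leaked "Auto R" rule.
# The threshold and all identifiers are assumptions for illustration;
# the leaked documents describe the behavior, not an implementation.

from dataclasses import dataclass

AUTO_R_VIEW_CAP = 5_000  # assumed value; the real threshold was not disclosed


@dataclass
class Video:
    video_id: str
    views: int
    rating: str  # e.g. "featured", "normal", "auto_r"


def eligible_for_for_you_feed(video: Video) -> bool:
    """Exclude an "Auto R"-rated video from the For You feed once it
    passes the view cap, silently limiting its further reach."""
    if video.rating == "auto_r" and video.views >= AUTO_R_VIEW_CAP:
        return False
    return True


# A capped video is recommended normally until it hits the threshold,
# after which it simply stops appearing in the feed.
clip = Video(video_id="v123", views=4_999, rating="auto_r")
print(eligible_for_for_you_feed(clip))  # True
clip.views += 1
print(eligible_for_for_you_feed(clip))  # False
```

The key point such a rule illustrates is that nothing is deleted and no one is notified; the content just quietly stops being distributed once it crosses the cap.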

According to Netzpolitik, TikTok employees repeatedly pointed out that this policy was itself discriminatory: it punished people not for any bad behavior, but because of a belief that their differences might lead to them being bullied. However, the employees claimed they were prevented from changing the policy by TikTok's corporate parent, ByteDance, which dictated the company's content moderation policies.

Decisions to be made by TikTok:
  • What are the best ways to deal with and prevent bullying on the platform?
  • What are the real-world impacts of suppressing the viral reach of content based on the type of person who created it?
  • Is it appropriate to effectively deny those you think will be bullied full access to your platform in order to head off the possibility of bullying?
  • What data is being assessed to justify the assumption that "Auto R" is an effective anti-bullying tool?
Questions and policy implications to consider:
  • When policymakers push platforms hard to stop bullying, will that pressure lead to unintended consequences, such as effectively minimizing potential victims' access to those platforms, rather than dealing with the root causes of bullying?
  • Will efforts to prevent bad behavior instead be used to sweep that behavior under the rug, rather than to examine how to actually make a platform safer?
  • What is the role of technology intermediaries in preventing bad behavior?
Resolution: TikTok admitted that these rules were a blunt instrument, put in place rapidly to try to minimize bullying on the platform, but said the company had since realized the approach was wrong and had implemented more nuanced policies:
"Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy," he told the BBC."This was never designed to be a long-term solution, and while the intention was good, it became clear that the approach was wrong."We have long since removed the policy in favour of more nuanced anti-bullying policies."
However, the Netzpolitik report suggested that this policy had been in place at least until September of 2019, just three months before its reporting came out in December of 2019. It is unclear exactly when the more nuanced anti-bullying policies were put in place, but it is possible that they came about due to the public exposure and pressure from the reporting on this issue.

