
Sat, 05 Jun 2021

Content Moderation Case Study: Roblox Tries To Deal With Adult Content On A Platform Used By Many Kids (2020)


Summary: Roblox is an incredibly popular online platform for games, especially among younger users. In 2020, it was reported that two-thirds of all US kids between 9 and 12 years old use Roblox, as do one-third of all Americans under the age of 16. The games on the platform can be developed by anyone: Roblox provides a very accessible development environment built around the scripting language Lua, and many of the games are created by Roblox's young users themselves.

Given Roblox's target market, the company has put in place a fairly robust content moderation program designed to stop content that the company deems inappropriate. This includes all kinds of profanity and inappropriate language, as well as any talk of dating, let alone sexual innuendo. The company also does not allow users to share personally identifiable information.

The content moderation extends not just to players on the Roblox platform, but also to the many game developers who create and release games on Roblox. Roblox apparently uses AI moderation from a company called Community Sift as well as human moderators from iEnergizer. Recent reports say that Roblox has a team of 2,300 content moderators.

Given the competing interests and incentives, there are both widespread reports of adult content being easily available (including to children) and complaints from developers about having their content, projects, and accounts shut down over perfectly reasonable material, leading to widespread claims that the moderation system is completely arbitrary.
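To make the kind of screening described above concrete, the sketch below is a minimal, hypothetical Lua illustration of keyword- and pattern-based text filtering (blocking dating talk and attempts to share phone numbers). The function name and pattern list are invented for this example; Roblox's actual pipeline, combining Community Sift's AI with human reviewers, is far more sophisticated than simple pattern matching.

-- Hypothetical illustration only; not Roblox's or Community Sift's actual system.
local blockedPatterns = {
    "date me",           -- dating talk
    "girlfriend",
    "boyfriend",
    "phone number",      -- attempts to share personally identifiable information
    "%d%d%d%-%d%d%d%d",  -- digit runs that look like phone numbers
}

-- Returns true if the message should be held back for review.
local function shouldFilter(message)
    local lowered = string.lower(message)
    for _, pattern in ipairs(blockedPatterns) do
        if string.find(lowered, pattern) then
            return true
        end
    end
    return false
end

print(shouldFilter("want to be my girlfriend?"))        -- true
print(shouldFilter("nice build, want to play again?"))  -- false

Even this toy version shows why moderation at Roblox's scale is hard: overly broad patterns block innocent messages, while narrow ones are trivially evaded, which is why the company layers machine learning and thousands of human moderators on top.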

Roblox is then left trying to figure out how to better deal with such adult content without upsetting its developers or angering parents who don't want their children exposed to adult content while playing games.

Decisions to be made by Roblox:

Questions and policy implications to consider:

Resolution: Roblox has continued to evolve and try to improve its content moderation practices. As this case study was being written, the company announced plans to launch a content rating system for games, to better inform parents which games may be more appropriate (or inappropriate) for children. However, the company has been promising to improve its efforts to stop adult content from reaching children for many years -- and every few months more reports pour in. At the same time, developers who feel their content has been blocked for no clear reason continue to take to forums to complain about the lack of clarity and transparency in the moderation system.

Originally published on the Trust & Safety Foundation website.

Read more here

