
Sat, 12 Sep 2020

Content Moderation Case Study: Pinterest's Moderation Efforts Still Leave Potentially Illegal Content Where Users Can Find It (July 2020)


Summary: Researchers at OneZero have been following and monitoring Pinterest's content moderation efforts for several months. The "inspiration board" website hosts millions of images and other pieces of content uploaded by users.

Pinterest's moderation approach is somewhat unusual. Very little content is actually removed, even when it might violate the site's guidelines. Instead, as OneZero's researchers discovered, Pinterest has chosen to prevent the content from surfacing by blocking certain keywords from generating search results.

The problem, as OneZero noted, is that hiding content and blocking keywords doesn't actually prevent users from finding questionable content. Some of this content includes images that sexually exploit children.

While normal users may never see this content using Pinterest's built-in search tools, users more familiar with how search functions work can still access content Pinterest feels violates its guidelines but hasn't actually removed from its platform. By navigating to a user's page, logged-out users can perform searches that appear to bypass Pinterest's keyword-blocking. Using Google to search the site -- instead of the site's own search engine -- can also surface content hidden by Pinterest.

Pinterest's content moderation policy appears to be mostly hands-off. Users can upload nearly anything they want, with the company only deleting (and reporting) clearly illegal content.
For everything else that's questionable (or potentially harmful to other users), Pinterest opts for suppression rather than deletion.

"Generally speaking, we limit the distribution of or remove hateful content and content and accounts that promote hateful activities, false or misleading content that may harm Pinterest users or the public's well-being, safety or trust, and content and accounts that encourage, praise, promote, or provide aid to dangerous actors or groups and their activities," a Pinterest spokesperson said of the company's guidelines.

Unfortunately, users who manage to bypass keyword filters or otherwise stumble across buried content will likely find themselves directed to more of it. Pinterest's algorithms surface content related to whatever users are currently viewing, potentially leading users even deeper into the site's "hidden" content.

Decisions to be made by Pinterest:

Questions and policy implications to consider:

Resolution: Pinterest's content moderation strategy remains mostly unchanged. As the spokesperson's statement suggests, the site appears to feel that hiding content addresses most of the concerns raised, even if it allows more determined users to locate content the site would rather they never saw.

Read more here

