
Sat, 12 Dec 2020

Content Moderation Case Study: Facebook's AI Continues To Struggle With Identifying Nudity (2020)


Summary: Since its inception, Facebook has attempted to be more "family-friendly" than other social media services. Its hardline stance on nudity, however, has often proved problematic, as its AI and human moderators have flagged accounts over harmless images or failed to consider context when removing images and locking accounts.

The latest example of Facebook's AI failing to properly moderate nudity involves garden vegetables. A seed business in Newfoundland, Canada was notified that its image of onions had been removed for violating the terms of service. The picture apparently set off the auto-moderation system, which flagged the image for containing "products with overtly sexual positioning." A follow-up message noted the photo of a handful of onions in a wicker basket was "sexually suggestive."

Facebook's nudity policy has been inconsistent since its inception. Male breasts are treated differently than female breasts, resulting in some questionable decisions by the platform. The policy has also caused problems for definitively non-sexual content, like photos posted by breastfeeding groups and breast cancer awareness videos. In this case, the round shape and flesh tones of the onions appear to have tricked the AI into reading garden vegetables as overtly sexual content, showing the AI still has a lot to learn about human anatomy and sexual positioning. (A crude illustration of how this kind of false positive can arise follows the company's statement below.)

Resolution: The seed company's ad was reinstated shortly after Facebook moderators were informed of the mistake. A statement from the company raised at least one more question, as its spokesperson did not clarify exactly what the AI thought the onions were, leaving users to speculate about what the spokesperson meant, as well as how the AI would react to future posts it mistook for, "well, you know."
"We use automated technology to keep nudity off our apps," wrote Meg Sinclair, Facebook Canada's head of communications. "But sometimes it doesn't know a walla walla onion from a, well, you know. We restored the ad and are sorry for the business' trouble."
Originally posted at the Trust & Safety Foundation website.


