Content Moderation Case Study: Facebook's Internal 'Hate Speech' Guidelines Appear To Leave Protected Groups Unprotected (June 2017)
Summary: Facebook has struggled to moderate "hate speech" over the years, drawing steady criticism not only from users but also from government officials around the world. Part of this struggle stems from the nature of the term "hate speech" itself, which is often vaguely defined. These definitions can also vary from country to country, adding to the confusion and the general difficulty of moderating user content.