e dot dot dot
a mostly about the Internet blog by


Cyberpunk 2077's Stream-Safe Setting Option For Its Music Failed To Keep Streamers Safe

Furnished content.


In November, as we were finally coming to the day when CD Projekt Red's newest opus, Cyberpunk 2077, was going to be released to the world, we wrote about how the developer had included a setting for the game specifically to keep streamers safe from copyright strikes. Essentially, the setting was meant to strip out all licensed music from the game and replace it with music that wouldn't land streamers in copyright jail while doing let's-plays. On the one hand, it was nice to see a developer so in favor of having its games streamed do this sort of thing. On the other hand, the fact that CD Projekt Red had to do so showed both what a failure Amazon/Twitch and the like have been at supporting their streamers through music licensing deals and, more importantly, what a hellscape copyright enforcement has become that all of this was even necessary.

Well, as it turns out, that hellscape is so complete that even the game's stream-safe setting failed to keep streamers safe.

The developer first warned potential streamers on Wednesday, before Cyberpunk 2077 officially launched in all regions, that a certain song (CDPR didn’t say which one) during the game’s “Braindance” sequences might trigger a Digital Millennium Copyright Act strike. That’s even if you’re using the specific in-game setting designed to toggle off copyrighted music for this exact reason.

But now CDPR says that the issue may be larger than it first realized, and it’s now advising streamers turn off in-game music entirely due to “additional instances in the game which might put a DMCA strike on your channel.” CDPR says a fix is on the way, but it’s not an ideal situation to have to disable all music (both copyrighted and original tracks) when streaming the game just to avoid tripping the automated detection systems that protect copyrighted works.
This is all immensely stupid. I'm sure some out there will want to blame the developer for this, with suggestions that it didn't roll out its stream-safe music setting well enough. But that's dumb. CD Projekt Red is trying to navigate this idiotic minefield, but because of the failings of streaming platforms combined with the absurdly strict culture of the music industry, it's very, very difficult to pull off.

Wouldn't it be easier if we all just admitted that hearing music, licensed or otherwise, playing in the soundtrack of a game being streamed isn't a damned threat or replacement for the actual original music? Nobody was going to go out and buy "Track X" from iTunes only to hear it on a let's-play and decide instead not to. That isn't a thing.

Instead, we have this absurd reality to deal with.

Read more here

posted at: 12:00am on 12-Dec-2020
path: /Policy | permalink




Content Moderation Case Study: Facebook's AI Continues To Struggle With Identifying Nudity (2020)

Furnished content.


Summary: Since its inception, Facebook has attempted to be more "family-friendly" than other social media services. Its hardline stance on nudity, however, has often proved problematic, as its AI (and its human moderators) have flagged accounts for harmless images and/or failed to consider context when removing images or locking accounts.

The latest example of Facebook's AI failing to properly moderate nudity involves garden vegetables. A seed business in Newfoundland, Canada was notified its image of onions had been removed for violating the terms of service. Its picture of onions apparently set off the auto-moderation, which flagged the image for containing "products with overtly sexual positioning." A follow-up message noted the picture of a handful of onions in a wicker basket was "sexually suggestive."

Facebook's nudity policy has been inconsistent since its inception. Male breasts are treated differently than female breasts, resulting in some questionable decisions by the platform. Its policy has also caused problems for definitively non-sexual content, like photos and other content posted by breastfeeding groups and breast cancer awareness videos. In this case, the round shape and flesh tones of the onions appear to have tricked the AI into thinking garden vegetables were overtly sexual content, showing the AI still has a lot to learn about human anatomy and sexual positioning.

Decisions to be made by Facebook:
  • Should more automated nudity/sexual content decisions be backstopped by human moderators?
  • Is the possibility of over-blocking worth the reduction in labor costs?
  • Is over-blocking preferable to under-blocking when it comes to moderating content?
  • Is Facebook large enough to comfortably absorb any damage to its reputation or user goodwill when its moderation decisions affect content that doesn't actually violate its policies?
  • Is it even possible for a platform of Facebook's size to accurately moderate content and/or provide better options for challenging content removals?
Questions and policy implications to consider:
  • Is the handling of nudity in accordance with the United States' historically more Puritanical views really the best way to moderate content submitted by users all over the world?
  • Would it be more useful to users if content were hidden -- but not deleted -- when it appears to violate Facebook's terms of service, allowing posters and readers to access the content if they choose to after being notified of its potential violation?
  • Would a more transparent appeals process allow for quicker reversals of incorrect moderation decisions?
Resolution: The seed company's ad was reinstated shortly after Facebook moderators were informed of the mistake. A statement from Facebook raised at least one more question, as its spokesperson did not clarify exactly what the AI thought the onions actually were, leaving users to speculate what the spokesperson meant, as well as how the AI would react to future posts it mistook for, "well, you know."
"We use automated technology to keep nudity off our apps," wrote Meg Sinclair, Facebook Canada's head of communications. "But sometimes it doesn't know a walla walla onion from a, well, you know. We restored the ad and are sorry for the business' trouble."
Originally posted at the Trust & Safety Foundation website.

Read more here

posted at: 12:00am on 12-Dec-2020
path: /Policy | permalink





RSS (site)  RSS (path)

ATOM (site)  ATOM (path)

Categories
 - blog home

 - Announcements  (0)
 - Annoyances  (0)
 - Career_Advice  (0)
 - Domains  (0)
 - Downloads  (3)
 - Ecommerce  (0)
 - Fitness  (0)
 - Home_and_Garden  (0)
     - Cooking  (0)
     - Tools  (0)
 - Humor  (0)
 - Notices  (0)
 - Observations  (1)
 - Oddities  (2)
 - Online_Marketing  (0)
     - Affiliates  (1)
     - Merchants  (1)
 - Policy  (3743)
 - Programming  (0)
     - Bookmarklets  (1)
     - Browsers  (1)
     - DHTML  (0)
     - Javascript  (3)
     - PHP  (0)
     - PayPal  (1)
     - Perl  (37)
          - blosxom  (0)
     - Unidata_Universe  (22)
 - Random_Advice  (1)
 - Reading  (0)
     - Books  (0)
     - Ebooks  (0)
     - Magazines  (0)
     - Online_Articles  (5)
 - Resume_or_CV  (1)
 - Reviews  (2)
 - Rhode_Island_USA  (0)
     - Providence  (1)
 - Shop  (0)
 - Sports  (0)
     - Football  (0)
          - Cowboys  (0)
          - Patriots  (0)
     - Futbol  (0)
          - The_Rest  (0)
          - USA  (0)
 - Technology  (1049)
 - Windows  (1)
 - Woodworking  (0)


Archives
 -2024  March  (164)
 -2024  February  (168)
 -2024  January  (146)
 -2023  December  (140)
 -2023  November  (174)
 -2023  October  (156)
 -2023  September  (161)
 -2023  August  (49)
 -2023  July  (40)
 -2023  June  (44)
 -2023  May  (45)
 -2023  April  (45)
 -2023  March  (53)
 -2023  February  (40)


My Sites

 - Millennium3Publishing.com

 - SponsorWorks.net

 - ListBug.com

 - TextEx.net

 - FindAdsHere.com

 - VisitLater.com