e dot dot dot
a mostly about the Internet blog

Fake 'U.S. Copyright Office' Imposter Gets Google To Delist URLs On Section 1201 Grounds

Furnished content.


We've done more than our share of posts in the past about the problems with the DMCA takedown system as currently practiced. The reason for so many posts is, in part, the sheer number of problems with how this all works. For starters, when notices go out to search engines like Google to delist "problem" URLs, those notices are often generated by automated systems, and the result, unsurprisingly, is that the vast majority of notices target URLs that are non-infringing. As in, over 99% of those notices. And even once we get past the malpractice of using automated, buckshot notices that cause an incredible amount of collateral damage, we then have to add the wide-open avenues for fraud and abuse of the DMCA system. That type of fraud runs the gamut, from trolls merely trying to cause chaos for the fun of it, to competitors trying to hurt rival producers of certain forms of content. In the immortal words of former NFL coach John Fox: "It's all a problem." And, on the fraud and abuse side, it's such a problem that perfectly legitimate URLs can get delisted by Google due to a request from "The U.S. Copyright Office," even though that office doesn't make those sorts of requests.

Google has received several takedown notices that claim to come from the 'U.S. Copyright Office', requesting the search engine to remove 'problematic' URLs. The Government body, which is generally not involved in copyright enforcement, informs TorrentFreak that it has nothing to do with these notices. Unfortunately, Google didn't immediately spot the imposter.

The Copyright Office is not supposed to take sides in these matters. So, we were quite surprised to see its name on several takedown notices that were sent to Google over the past few days.

The takedown requests are not typical 'Section 512' notices. Instead, they point out sites that circumvent technical protection measures, which is in violation of the DMCA's 'Section 1201.' That's also how Google processed them.
And process at least some of them, Google did. The notices claiming to be from the Copyright Office indicated they were sent on behalf of the "Video Industry Association of America," which doesn't appear to exist, based on a Google search I performed. Even if it does, the Copyright Office is not a party to these sorts of takedown requests on behalf of any organization. The URLs targeted appear to be mostly related to stream-ripping sites, but not just sites that offer that service. Instead, some of the targeted URLs merely mention sites that offer stream-ripping services, which is how several TorrentFreak posts got targeted. Whoever is doing this, it is most certainly not the Copyright Office.
This suspicion was confirmed by the U.S. Copyright Office. A spokesperson informs TorrentFreak that the notices in question were not submitted by them.

This doesn't mean that the takedown requests were ignored by Google. While our links are still indexed, several of the URLs listed in the notices have indeed been removed because of the notices, which is a problem.
It's a huge problem, actually. In fact, it demonstrates quite well how broken the current DMCA system has become. The fact that this sort of impersonation is so easy is one issue. The fact that Google is so inundated with these types of requests, which again are overwhelmingly illegitimate, that it cannot review them thoroughly enough to notice the clear impersonation of the Copyright Office at work here is another. And the fact that the DMCA process is obviously viewed by some bad actors as a wide-open tool for attacking their own competition is yet another. And, notably, there isn't even an appeal process for Section 1201 takedown requests.
Unfortunately, there is no counter-notification option for ‘Section 1201’ takedown notices. This means that sites and services that are affected by these bogus notices have no official appeal process they can use.But perhaps the U.S. Copyright Office can help with that?
Or maybe someone can just pretend to be the Copyright Office and help. You know, on its "behalf." It works for the bad actors, after all.
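
For what it's worth, even a crude first-pass screen could have caught this particular imposter. Below is a minimal sketch in Python of what such a screen might look like. To be clear, the field names, the agency list, and the routing labels are all my own assumptions for illustration, not anything from Google's actual pipeline. It encodes just the two facts from this story: the Copyright Office doesn't submit takedown notices at all, and Section 1201 notices have no counter-notification process, so acting on them deserves extra scrutiny.

    # Hypothetical first-pass screen for incoming takedown notices.
    # All names here are illustrative assumptions, not a real system's API.

    # U.S. government bodies that do not themselves submit DMCA notices.
    NON_SUBMITTING_AGENCIES = {"u.s. copyright office", "copyright office"}

    # Section 1201 (anti-circumvention) notices have no counter-notification
    # process, so a delisting mistake is effectively unappealable.
    HIGH_RISK_NOTICE_TYPES = {"dmca_1201"}

    def screen_notice(notice: dict) -> str:
        """Return 'reject', 'human_review', or 'automated_queue'."""
        sender = notice.get("sender_name", "").strip().lower()
        if sender in NON_SUBMITTING_AGENCIES:
            # A notice claiming to come from a body that never submits
            # takedowns is almost certainly an impersonation attempt.
            return "reject"
        if notice.get("notice_type") in HIGH_RISK_NOTICE_TYPES:
            # No appeal path exists for the target, so require a human
            # look before anything gets delisted.
            return "human_review"
        return "automated_queue"

    fake = {"sender_name": "U.S. Copyright Office", "notice_type": "dmca_1201"}
    print(screen_notice(fake))  # -> reject

Obviously, a pipeline at Google's scale involves verified submitter accounts and far more signal than a lookup table. But the point stands: the claimed sender alone was a giant red flag.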

Read more here

posted at: 12:00am on 26-Aug-2021
path: /Policy | permalink




Content Moderation Case Study: YouTube Deals With Disturbing Content Disguised As Videos For Kids (2017)

Furnished content.


Summary: YouTube offers an endless stream of videos catering to the preferences of users of every age, and it has become a go-to content provider for kids and their parents. The market for kid-oriented videos remains wide open, with new competitors surfacing daily and utilizing repetition, familiarity, and strings of keywords to get their videos in front of kids willing to spend hours clicking on whatever thumbnails pique their interest. YouTube leads this market.

Taking advantage of the low expectations of extremely youthful viewers, YouTube videos for kids are filled with low-effort, low-cost content: videos that use familiar songs, bright colors, and pop culture fixtures to attract and hold the attention of children. Most of this content is innocuous. But a much darker strain of content was exposed by amateur internet sleuths and swiftly dubbed "Elsagate," after the main character of Disney's massively popular animated hit, Frozen. At the r/ElsaGate subreddit, redditors tracked down videos aimed at children that contained adult themes, sexual activity, or other non-kid-friendly content.

Among the decidedly not-safe-for-kids subject matter listed by r/ElsaGate are injections, gore, suicide, pregnancy, BDSM, assault, rape, murder, cannibalism, and use of alcohol. Most of these acts were performed by animated characters (or actors dressed as the characters), including the titular Elsa as well as Spiderman, Peppa Pig, Paw Patrol, and Mickey Mouse. According to parents, users, and members of the r/ElsaGate subreddit, some of this content could be accessed via the YouTube Kids app, a kid-oriented version of YouTube subject to stricter controls and home to curated content meant to steer child users clear of adult subject matter.

Further attention was drawn to the issue by James Bridle's post on the subject, entitled "Something is Wrong on the Internet." The post, preceded by numerous content warnings, detailed the considerable amount of disturbing content that was easily finding its way to youthful viewers, mainly thanks to kid-friendly tags and innocuous thumbnails. The end result, according to Bridle, was nothing short of horrific:
“To expose children to this content is abuse. We’re not talking about the debatable but undoubtedly real effects of film or videogame violence on teenagers, or the effects of pornography or extreme images on young minds, which were alluded to in my opening description of my own teenage internet use. Those are important debates, but they’re not what is being discussed here. What we’re talking about is very young children, effectively from birth, being deliberately targeted with content which will traumatise and disturb them, via networks which are extremely vulnerable to exactly this form of abuse. It’s not about trolls, but about a kind of violence inherent in the combination of digital systems and capitalist incentives. It’s down to that level of the metal.”
- James Bridle
"Elsagate" received more mainstream coverage as well. A New York Times article on the subject wondered what had happened and suggested the videos had eluded YouTube's algorithms that were meant to ensure content that made its way to its Kids channel was actually appropriate for children. YouTube's response when asked for comment was that this content was the "extreme needle in the haystack," perhaps, an immeasurably small percentage of the total amount of content available on YouTube Kids. Needless to say, this answer did not make critics happy, and many suggested the online content giant rely less on automated moderation when dealing with content targeting kids.Company Considerations:
  • How should content review and moderation be different for content targeting younger YouTube users?
  • How could a verification process be deployed to vet users creating content for children?
  • What processes can be used to make it easier to find and remove/restrict content that appears to be kid-friendly but is actually filled with adult content? (One toy sketch follows this list.)
  • When content like what was described in the case study does get through the moderation process, what can be done to restore the trust of users, especially those with younger children?
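
On the third consideration above, one hypothetical triage heuristic is to flag any video whose metadata mixes kid-oriented signals with adult-theme signals, and pull it out of the automated pipeline for human review. The Python sketch below is purely illustrative: the keyword lists and the Video shape are assumptions, not YouTube's actual system, and real moderation would need far more than string matching. Still, it shows the shape of the "Elsagate" pattern: either set of signals alone is unremarkable; the combination is what warrants a closer look.

    # Hypothetical triage heuristic; keyword lists and data shape are
    # illustrative assumptions, not YouTube's actual moderation system.
    from dataclasses import dataclass, field

    KID_SIGNALS = {"elsa", "spiderman", "peppa pig", "paw patrol", "nursery"}
    ADULT_SIGNALS = {"injection", "gore", "bdsm", "alcohol", "suicide"}

    @dataclass
    class Video:
        title: str
        tags: list[str] = field(default_factory=list)

    def needs_human_review(video: Video) -> bool:
        """Flag metadata that targets kids while carrying adult-theme terms."""
        text = " ".join([video.title, *video.tags]).lower()
        targets_kids = any(k in text for k in KID_SIGNALS)
        adult_themes = any(a in text for a in ADULT_SIGNALS)
        # Either signal set alone is unremarkable; the mix is the red flag.
        return targets_kids and adult_themes

    v = Video(title="Elsa visits the doctor for an injection!", tags=["kids"])
    print(needs_human_review(v))  # -> True

A keyword mismatch like this is cheap enough to run on every upload, which is exactly why it can only be a routing signal for human moderators, never a removal decision on its own.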
Issue Considerations:
  • Should a product targeting children be siloed off from the main product to ensure the integrity of the content, as well as make it easier to manage moderation issues?
  • Does creating a product specifically for children increase the chance of direct regulation or intervention by government entities? If so, how can a company prepare itself for this inevitability?
  • If creating a “restricted” product for children, should it require all content be fully and thoroughly vetted? If so, would that become prohibitively costly, making it significantly less likely that companies will create products for children? Is there a way to balance those things?
Resolution: Immediately following these reports, YouTube purged content from YouTube Kids that did not meet its standards. It delisted videos and issued new guidelines for contributors. It added a large number of new human moderators, bringing its total to 10,000. YouTube also removed the extremely popular "Toy Freaks" channel, which users had suggested contained child abuse, after investigating its content.

YouTube wasn't the only entity to act after the worldwide exposure of "Elsagate" videos. Many of these videos originated in China, prompting the Chinese government to block certain search keywords to limit local access to the disturbing content, and to shutter at least one company involved in creating these videos.

Originally posted to the Trust & Safety Foundation website.

Read more here

posted at: 12:00am on 26-Aug-2021
path: /Policy | permalink



