Valve Clears Up Nothing With Its Latest Explanation Of What Games It Will Ban As 'Troll Games'
You will recall that several months back, Valve released a statement outlining what it considered to be sweeping changes to its game curation duties. While the company made much of forthcoming tools on the Steam store for filtering game searches, pretty much everyone focused on the platform's claim that it would no longer keep any game off its platform unless it was "illegal or a troll game." That, of course, still left all kinds of ambiguity as to what is and is not allowed on the platform, and it provided a wide avenue through which Steam could still drive its oversight truck. This led to our having a podcast discussion in which I pointed out repeatedly that this was every bit as opaque a policy as the one that preceded it, which was followed by the real-world example of developers across the spectrum pointing out that they in fact had no idea what the policy actually meant. In other words, the whole thing has generally been an unproductive mess.

A mess which Valve tried to clean up this past week in an extensive blog post on its site that attempted to define what it meant by "troll games." As the folks at Ars point out, this attempt at clarity is anything but. Much of what Valve lays out as "troll games" makes sense: scam games that work Steam's inventory system, or try to manipulate developer Steam keys, or games that are simply broken due to a lack of seriousness on the part of the developer. But then it also said the definition included what most people thought of in the original announcement: games that "just try to incite and sow discord."
Valve's Doug Lombardi said at the time that Active Shooter was removed from Steam because it was "designed to do nothing but generate outrage and cause conflict through its existence." That designation came despite the fact that the developer said the game was "a dynamic SWAT simulator in which dynamic roles are offered to players" and that he would "likely remove the shooter's role in the game by the release" after popular backlash to the idea. As the developer noted at the time, too, "there are games like Hatred, Postal, Carmageddon and etc., which are even [worse] compared to Active Shooter and literally focuses on mass shootings/killings of people."

It's as good an example as any for pointing out what has always been true about art forms: one person's inflammatory content is another person's artistic genius. More worrisome, Valve's own words on its policy put the company squarely in the business of mind-reading, with its post suggesting that troll developers are those that aren't actually interested in making or selling a game. It relies on Valve's own analysis of a developer's "good faith" in putting forth the game.
While good-faith developer efforts can obviously lead to "crude or lower quality games" on Steam, Valve says that "it really does seem like bad games are made by bad people." And it's those bad games from bad people that Valve doesn't want on Steam.

Absent a mind-reading device, determining a developer's motives isn't an easy task. Defining what separates a good-faith effort to sell a game from a "troll" involves a "deep assessment" of the developer, Valve says, including a look at "what they've done in the past, their behavior on Steam as a developer, as a customer, their banking information, developers they associate with, and more."

We could spend a great deal of time discussing how qualified Valve is to make these determinations, or what value such curation provides for a platform like Steam. Or we could talk instead about whether this treatment sets video games back a notch or two as an art form, with corporate oversight playing the role of evaluating each artist's intent. But the real lesson here is that, whatever you think of Valve's definitions above, it is clear as day that these explanations are not in line with the overall message in Valve's original notice of the change in policy. The company explicitly said at that time that it didn't believe it should be in the business of deciding what types of games with what types of content users should see on the platform. The whole point of this was wide inclusion, whereas it's really hard to see any daylight between this updated explanation and Steam's historical curation policy. Valve still gets to decide what goes on the platform.

So many words and so much time for so little effect, in other words.
posted at: 12:35am on 12-Sep-2018
The Intellectual Dishonesty Of Those Supporting The Existing Text Of The EU Copyright Directive
As the EU gets ready to vote (again) on various amendments for the EU Copyright Directive, there has been an incredibly dishonest push by supporters of the original directive (often incorrectly claiming they're thinking of creators' best interests) to argue that the warnings of those who think these proposals are dangerous are misleading. What they are doing is unfortunate, but it deserves to be called out -- because of just how dishonest it is. Their arguments usually involve misrepresenting the law and its impact in order to completely misrepresent what will happen.

There are numerous examples of this in practice, but I'll use this article in the German site FAZ as just one example of the kind of rhetoric being used, as it is an impressively intellectually bankrupt version of the argument I'm seeing quite a bit lately. It was written by a guy named Volker Rieck, who has shown up in a bunch of places attacking critics of the EU Copyright Directive. He apparently runs some sort of anti-piracy organization, which perhaps shouldn't be surprising. But that doesn't excuse the sheer dishonesty of his arguments.
Very early in the process, the only MEP from the Pirate Party, Julia Reda, began to fight the propositions. For her campaign, she made very strong use of distortion and simplification. The word "link tax"..., by way of which Reda wanted to stop Article 11 of the policy, may be catchy, but there is something unwittingly comical to the earnest suggestion that there is a tax, collected by the tax office, on using links to online pieces of writing.

This is... odd. The word "tax" is used in a variety of contexts to show the excess costs of certain proposals. Nothing about it deliberately suggests a "tax office" will be involved. But the "link tax" is quite real. The whole point of Article 11 is to create a new form of license -- requiring certain sites to pay for nearly every use of media content. Let's be clear, because it often gets lost in the discussion: all of this content is already covered by copyright. At issue is whether or not one can link to it and include a short summary of the contents without first having to pay a license above and beyond what one would have to pay to license the content itself. And this is not an ambiguous issue. In the latest draft of the proposal from MEP Axel Voss, it's pretty explicit that the link tax is about "obtaining fair and proportionate remuneration for such uses." The following is directly from the text of Voss's proposed amendment (which is more or less the "default" plan for the Copyright Directive, as he's the main MEP behind the Directive):
Online content sharing service providers perform an act of communication to the public and therefore are responsible for their content and should therefore conclude fair and appropriate licensing agreements with rightholders.

It is absolutely a tax to require a license for such uses. And while Voss has included an escape clause saying that this "does not extend to acts of hyperlinking with respect to press publications," it is left entirely vague how to distinguish when a link with some basic link text is allowed without a license and when it needs to be licensed. Indeed, Voss's only real limitation is that the rules "shall not extend to mere hyperlinks, which are accompanied by individual words." Individual words. What goes beyond "individual"? Considering that individual means "single" or "one," it seems clear that under Voss's definition, accompanying a link with two words may subject you to requiring a license to link. This is even worse than the awful German law, which only required licenses on something beyond "short" phrases (and even that was not clearly defined).

Back to the awful FAZ piece:
The polemical buzzword "upload filter", to oppose Article 13 of the policy, wasn't much better. Upload filters are not, and were never, part of the proposal, but the word works well in fueling fears. Indeed, Julia Reda managed to convince some of her supporters that if the policy on copyright law is passed, everything on the internet will be filtered, and memes - yes, those beloved memes - will be forbidden altogether. The fact that the policy says something completely different was of no more than marginal interest. According to the actual proposal, web platforms - and only web platforms - would have been obliged to enter into license agreements with the individual right owners of user-uploaded content or the copyright collectives by which the content is maintained.

This is particularly galling in just how dishonest it is. Saying that this won't impact users, but merely platforms, is bullshit. How do most users communicate these days? On platforms. And saying that platforms then have to license all content pretends that the "cost" of that is not then passed along to the users. And that "cost" isn't just in monetary terms. It will, undoubtedly, come in the form of perfectly non-infringing works completely taken offline, either because of accidental identification or malicious takedown efforts. Sure, some people could try to post content on their own sites, but how long will it take until those who support Article 13 move down the stack and argue that hosting companies who allow users to host their own websites are in the same classification as the platforms required to obtain licenses under the law?

It gets worse:
In this scenario, it's the platforms who are responsible for license payments; users have nothing to do with it.

I mean, come on. The platforms are the arbiters of end users' speech in this case. Of course users have everything to do with it. If it's too costly, the platforms will default to blocking the content rather than allowing it. And, again, any costs will be passed on from the platforms to the users in some form or another.
It would simply have meant a duty for the platforms to be transparent in order to comprehensively account for the licensing and to correctly forward the payments to the respective right owners. If a platform didn't want to enter such a license agreement, the EU policy would at least hold that platform responsible to keep its own website clean. How it achieves that is up to the platform itself, as long as it prevents copyright infringements.

This is also particularly dishonest. If a platform doesn't want to enter into such a license... it would be responsible for keeping its website clean. And how would it possibly do that? It would be required to pay for an incredibly expensive (and ineffective) upload filter. So to claim that this isn't a proposal for upload filters is utter nonsense. Also, the whole "it's up to the platform, as long as it prevents copyright infringement" is fantasy-land thinking, as if there's some solution that magically stops all copyright infringement. Whoever wrote this is incredibly dishonest or ignorant of how the world works. There is no solution that prevents all copyright infringement -- other than not existing at all.
Unfortunately, though, many of those who have joined the discussion have refused to put in the intellectual effort to read the proposal in its updated form and understand its intention. This goes for everyone all the way from web associations of political parties to journalist Sascha Lobo, who wrote of "censorship machines"... in "der Spiegel". If only they had read what they publicly decry! Then maybe they would have realised that for the first time, users of platforms that don't license content would have had substantial leverage, including a right to mediation in the case of the blocking of content. At that point, at the latest, it should have become clear that the term "censorship" misses the mark. Perhaps it was simply too complicated to get hold of and understand the current version of the document?

Leverage? What leverage? If the law requires you not to allow any infringement, you have no leverage at all. Second, the concern about censorship is not at all made up. We know it's real because we see it happen all the time under existing notice-and-takedown regimes, which are significantly less extreme and less draconian than what's required under Article 13. The censorship comes from platforms seeking to avoid significant liability (and costly trials). They are incentivized (heavily) to take down content to avoid the risk and liability. And thus, they will take down lots and lots of content rather than risk it -- especially when held to ridiculous standards like preventing all infringement from appearing on their platforms.

The dishonesty continues:
But let's talk about the platforms, since they are the ones affected by this. More specifically, let's talk about one of the most successful platforms: Youtube. It's exclusively platforms like Youtube that the policy addresses. Not start-ups, not online shops, and not open source platforms.

This is blatantly untrue. As we noted back in July, those behind the EU Copyright Directive explicitly said the opposite. Here's what they said:
Any platform is covered by Article 13 if one of their main purposes is to give access to copyright protected content to the public. It cannot make any difference if it is a small thief or a big thief as it should be illegal in the first place. Small platforms, even a one-person business, can cause as much damage to right holders as big companies, if their content is spread (first on this platform and possibly within seconds throughout the whole internet) without their consent.

That's from the Committee that voted on the Directive. So to say the Directive only targets platforms like YouTube, when the crafters of the law itself say that it applies to small platforms and even one-person businesses, shows just how dishonest supporters are concerning all of this. Separately, it's obvious that it doesn't just apply to YouTube, because YouTube already complies with Article 13 via things like ContentID. To argue that the law is targeting YouTube is ridiculous. Why write an entire new law just to say "that thing you're already doing, yeah, keep that up"? The author of the FAZ piece then goes on to talk all sorts of nonsense about Content ID.
For years, Youtube has used a system called Content ID, which allows right owners who have uploaded their content to the platform to decide what happens to it if and when it's used. This ranges from monetarisation - if, for instance, a user uploads a video which includes music, the right owner of that music receives a portion of the video's ad revenue - to the blocking of the video. Above all else, it's meant to prevent third parties from making money using other people's content.

But it gets better still. A system called Copyright Match, which Youtube developed for its channel owners, is just now ready to be put into practice. It is, as it were, a "Content ID" light, and is mainly intended to assist Youtubers in reacting to identical videos. The user who uploaded the video first automatically receives a message and gets to decide what happens to the duplicate, including the possibility to block it. Is there anybody out there who'd brand this "censorship"? Apparently not - after all, there have been no demonstrations against Content ID and Copyright Match. We haven't seen public outrage against Youtube's "censorship machine".

Anyone claiming that there hasn't been outrage over ContentID taking down all sorts of legitimate content simply has no legitimate argument for being part of this debate. There has been massive and sustained outrage over ContentID and how it takes down all sorts of legitimate content. We've had probably over a dozen posts on Techdirt alone about bogus takedowns via ContentID, and people have been highlighting the problems of ContentID leading to inappropriate censorship for nearly a decade.

If someone is going to insist (1) that Article 13 only targets platforms like YouTube, even when the authors of the law insist that's not true, and (2) that no one complains about ContentID takedowns, they have no business arguing that the attacks on the EU Copyright Directive are untruthful. They are ignorant or lying.
Neither is a good look.

The rest of the article is out-and-out conspiracy theorizing, including (I kid you not) accusations of George Soros' involvement in fighting against the Copyright Directive. And yet, amazingly, some people are taking this shit seriously. It is not serious. It is blatantly dishonest and should be treated as such.