Content Moderation Case Study: Dealing With Demands From Foreign Governments (January 2016)
Summary: US companies obviously need to obey US laws, but dealing with demands from foreign governments can present challenging dilemmas. The Sarawak Report, a London-based investigative journalism operation that reports on issues and corruption in Malaysia, was banned by the Malaysian government in the summer of 2015. The publication chose to republish its own articles on the US-based Medium.com website (beyond its own website) in an effort to get around the Malaysian ban.

In January of 2016, the Sarawak Report published an article about Najib Razak, then prime minister of Malaysia, entitled "Najib Negotiates His Exit BUT He Wants Safe Passage AND All The Money!" It related to allegations of corruption, first published in the Wall Street Journal, regarding money flows from the state-owned 1MDB investment firm.

The Malaysian government sent Medium a letter demanding that the article be taken down. The letter claimed that the article contained false information and violated Section 233 of the Communications and Multimedia Act, a 1998 law that prohibits the sharing of offensive and menacing content. In response, Medium requested further evidence of what was false in the article.

Rather than responding to Medium's request with the full content assessment from the Malaysian Communications and Multimedia Commission (MCMC), the MCMC instructed all Malaysian ISPs to block all of Medium throughout Malaysia.

Decisions to be made by Medium:
- How do you handle demands from foreign governments to take down content?
- Does it matter which government? If so, how do you determine which governments to trust?
- How do you determine the accuracy of claims from a foreign government regarding things like false reporting?
- What are the trade-offs of being blocked entirely by a country?

Questions and policy implications to consider:
- Taking down content that turns out to be credible accusations of corruption can serve to support that corruption and censor important reporting. Yet leaving up information that turns out to be false can lead to political unrest. How should a website weigh those two sides?
- Should it be the responsibility of websites to investigate who is correct in these scenarios?
- What is the wider impact of an entire website for user-generated content being blocked in a country like Malaysia?

Resolution: The entire Medium.com domain remained blocked in Malaysia for over two years. In May of 2018, Najib Razak was replaced as Prime Minister by Mahathir Mohamad, who had previously been Prime Minister from 1981 to 2003 as part of the Barisan Nasional coalition. In 2018, however, Mahathir represented the Pakatan Harapan coalition, the first opposition coalition to defeat Barisan Nasional in a Malaysian election since Malaysian independence. Part of Pakatan Harapan's platform was to allow for more press freedom.

Later that month, people noticed that Medium.com was no longer blocked in Malaysia. Soon after, the MCMC put out a statement saying that Medium no longer needed to be blocked because an audit of 1MDB had been declassified days earlier: "In the case of Sarawak Report and Medium, there is no need to restrict when the 1MDB report has been made public."

Originally published on the Trust & Safety Foundation website.
Read more here
posted at: 12:00am on 16-Jan-2021
path: /Policy | permalink
A Few More Thoughts On The Total Deplatforming Of Parler & Infrastructure Content Moderation
I've delayed writing deeper thoughts on the total deplatforming of Parler, in part because there was so much else happening (including some more timely posts about Parler's lawsuit regarding it), but more importantly because for years I've been calling for people to think more deeply about content moderation at the infrastructure layer, rather than at the edge, because those issues are much more complicated than the usual content moderation debates.

And once again I'm going to make the mistake of offering a nuanced argument on the internet. I urge you to read through this entire post, resist any kneejerk responses, and consider the larger issues. In fact, when I started to write this post, I thought it was going to argue that the moves against Parler, while legal, were actually a mistake and something to be concerned about. But as I explored the arguments, I simply couldn't justify any of them. Upon inspection, they all fell apart. And so I think I'll return to my initial stance that the companies are free to make decisions here. There should be concern, however, when regulators and policymakers start talking about content moderation at the infrastructure layer.

The "too long, didn't read" version of this argument (and again, please try to understand the nuance) is that even though Parler is currently down, it's not due to a single company having total control over the market. There are alternatives. And while it appears that Parler is having difficulty finding any such alternative to work with it, that's the nature of a free market. If you are so toxic that companies don't want to do business with you, that's on you. Not them.

It is possible to feel somewhat conflicted over this. I initially felt uncomfortable with Amazon removing Parler from AWS hosting, effectively shutting down the service, and with Apple removing its app from the app store, effectively barring it from iPhones. In both cases, those seemed like very big guns that weren't narrowly targeted.
I was less concerned about Google's similar removal, because that didn't block Parler from Android phones, since you don't have to go through Google to get an app onto an Android phone. But (and this is important) I think all three moves are clearly legal and reasonable steps for the companies to take. As I explored each issue, I kept coming back to a simple point: the problems Parler is currently facing are due to its own actions and the unwillingness of companies to associate with an operation so toxic. That's the free market.

If Parler's situation were caused by government pressure, or because there were no other options for the company, then I would be a lot more concerned. But that does not appear to be the case.

The internet infrastructure stack is represented in different ways, and there's no one definitive model. But an easy way to think of it is that there are "edge" providers -- the websites you interact with directly -- and then there's everything beneath them: the Content Delivery Networks (CDNs) that help route traffic, the hosting companies/data centers/cloud providers that host the actual content, the broadband/network/access providers, and the domain registries and registrars that help handle the naming and routing setup. And there are lots of other players in there as well, some (like advertising and certain communications providers) with elements on the edge and elements deeper in the stack.

But a key thing to understand is the level of granularity with which different players can moderate, and the overall impact their moderation can have. It's one thing for Twitter to remove a tweet. It's another thing for Comcast to say "you can't access the internet at all." The consequences of moderation get much more severe the deeper you go into the stack. In this case, AWS's only real option for Parler was to remove the entire service, because it couldn't just target the problematic content (of which there was quite a lot). As for the app stores, it's a tricky question.
Are app stores infrastructure, or edge? Perhaps they are a little of both, but they had the same limited options: remove the app entirely, or leave it up with all its content intact.

For many years, we've talked about the risks of saying that players deeper in the infrastructure stack should be responsible for content moderation. I was concerned, back in 2014, when there was talk of putting liability on domain registrars if domains they had registered were used for websites that broke the law. There have been a few efforts to hold such players responsible as if they were the actual lawbreakers, and that obviously creates all sorts of problems, especially at the 1st Amendment level. As you move deeper into the stack, the moderation options look less like scalpels and more like sledgehammers that remove entire websites from existence.

Almost exactly a decade ago, in a situation that has some parallels to what's happened now, I highlighted concerns about Amazon deciding to deplatform Wikileaks in response to angry demands from then-Senator Joe Lieberman. I found that to be highly problematic, and likely unconstitutional -- though Wikileaks, without a US presence, had little standing to challenge it at the time. My concern was less with Amazon's decision, and more with Lieberman's pressure.

But it's important to go back to first principles in thinking through these issues. It's quite clear that companies like Amazon, Apple, and Google have every legal right to remove services they don't want to associate with, and there are a ton of reasons why people and companies might not want to associate with Parler. But many people are concerned about the takedowns based on the idea that Parler might be "totally" deplatformed, and that one company saying "we don't want you here" could leave them with no other options.
That's not so much a content moderation question as a competition one.

If it's a competition question, then I don't see why Amazon's decision is really a problem either. AWS only has 32% market share. There are many other options out there -- including the Trump-friendly cloud services of Oracle, which promotes on its own website how easy it is to switch from AWS. Oracle's cloud already hosts Zoom (and now TikTok's US services). There's no reason it can't also host Parler.*

But, at least according to Parler, it has been having trouble finding an alternative that will host it. And on that front it's difficult to feel sympathy. Any business has to build relationships with other businesses to survive, and if no other businesses want to work with you, you might go out of business. Landlords might not want to rent to troublesome tenants. Fashion houses might choose not to buy from factories with exploitative labor practices. Businesses police each other's business practices all the time, and if you're so toxic that no one wants to touch you... at some point, maybe that's on you, Parler.

The situation with Apple and Google is slightly different, and again, there are lots of nuances to consider. With Apple, obviously, it is controlling access to its own hardware, the iPhone. And there's a reasonable argument to be made that Apple offers the complete package, and part of that deal is that you can only add apps through its app store. Apple has long argued that it does this to keep the phone secure, though it could raise some anti-competitive concerns as well. But Apple has banned plenty of apps in the past (including Parler competitor Gab). And that's part of the nature of iPhone ownership. And, really, there is a way to route around Apple's app store: you can still create web apps that will work on iOS without going through the store.
This does limit functionality and the ability to reach deeper into the iPhone for certain features, but those are the tradeoffs.

With Google, it seems like there should be even less concern. Not only could Parler work as a web app, Google also allows you to sideload apps without using the Google Play store. So the limitation was simply that Google didn't want the app in its own store. Indeed, before Amazon took all of Parler down, the company was promoting its own APK to sideload on Android phones.

In the end, it's tough to argue that this is as worrisome as my initial gut reaction said. I am still concerned about content moderation when it reaches the infrastructure layer. I am quite concerned that people aren't thinking through the kind of governance questions raised by these sledgehammer-not-scalpel decisions. But when exploring each of the issues as it relates to Parler specifically, it's hard to find anything to be that directly concerned about. There are, mostly, alternatives available for Parler. And in the one area where there apparently aren't (cloud hosting), it seems to be less because AWS has market power, and more because lots of companies just don't want to associate with Parler.

And that is basically the free market telling Parler to get its act together.

* It's noteworthy that AWS customers can easily migrate to Oracle Cloud only because Oracle copied AWS's API without permission, which, according to Oracle's own lawyers, is copyright infringement. Never expect Oracle not to be hypocritical.
Read more here
posted at: 12:00am on 16-Jan-2021
path: /Policy | permalink