As the fallout from last week’s attack on the Capitol continues to be front page news, big questions surround big tech’s role as the arbiter of acceptable online speech.
After Facebook suspended President Trump’s account indefinitely and Twitter shut him down permanently, YouTube announced Wednesday that it will be freezing the president’s account for a week, citing concerns over the ongoing potential for violence.
Apple, Google, and Amazon have also pulled the plug on Parler, a social network that has grown increasingly popular among conservatives in recent months and has earned a reputation for allowing content that would not be tolerated on other channels, including numerous calls for violence. Parler has responded by filing a lawsuit against Amazon, claiming that Amazon Web Services (AWS) violated antitrust laws and breached its contract by failing to provide 30 days' notice of cancellation.
In the 18-page complaint, filed in the U.S. District Court for the Western District of Washington, Parler argues that the decision to suspend its account "is apparently motivated by political animus" and designed to "reduce competition in the microblogging services market to the benefit of Twitter," which recently signed a long-term deal with AWS and stands as one of Parler's main competitors. The suit includes claims for breach of contract, tortious interference, and violation of antitrust law, alleging that Amazon took no comparable action against Twitter even though similar rhetoric appeared on that platform. Parler is seeking a temporary restraining order to prevent Amazon from removing the social platform from its servers, arguing that the shutdown will cause irreparable harm to its business.
Can Amazon really do that? What about the First Amendment?
The suit also comes as tensions over alleged First Amendment violations remain high. It's well established that the First Amendment limits the government's ability to restrict people's speech, not private businesses' ability to do so. Stated differently, the First Amendment only applies to public places, not private spaces, such as a social media platform. Not so fast, though: in 1980, the Supreme Court in Pruneyard Shopping Center v. Robins held that a shopping mall owner could not exclude a group of high school students who were engaged in political advocacy in quasi-public spaces in a private shopping mall. The Court accepted the argument that it was within California's power to guarantee this expansive free speech right since it did not unreasonably intrude on the rights of private property owners. Likewise, in 2017, the Supreme Court in Packingham v. North Carolina held that the First Amendment prohibited the government from banning sex offenders from social media websites, implicitly treating social media as a public space. The question, then, of whether Twitter and other social media spaces where people congregate, along with their associated cloud servers, are "public" and deserving of First Amendment protections is not clear-cut.
For its part, Amazon claims it was well within its rights to dismiss Parler after it failed to promptly identify and remove content encouraging or inciting violence against others, a direct violation of Amazon’s terms of service. According to court documents, Amazon says it reported more than a hundred examples of such violative content to Parler in just the past several weeks. In its official response to Parler’s restraining order request, AWS states that this “case is not about suppressing speech or stifling viewpoints. It is not about a conspiracy to restrain trade. Instead, this case is about Parler’s demonstrated unwillingness and inability to remove from the servers of Amazon Web Services (‘AWS’) content that threatens public safety.”
Most experts see Amazon's decision to remove Parler as legitimate, and the microblogger will have a steep climb arguing against what are clear violations of terms. It's also not without precedent: Cloudflare, a company that provides tools to help websites protect against cyberattacks and load content more quickly, made a similar decision after facing pressure to drop The Daily Stormer, a neo-Nazi hate site, from its service after the deadly riots in Charlottesville in 2017. It later dropped 8chan, a controversial forum linked to several deadly attacks, including those in El Paso, Texas and Christchurch, New Zealand.
What does this mean for businesses, consumers, and the future of social media?
While this case was born out of a national crisis, there is little incentive and less legal standing for businesses to start an online political witch hunt. As Amazon stated in its response to Parler, "AWS has no incentive to stop doing business with paying customers that comply with its agreements."
But while Amazon and others are arguably on solid legal ground in their choice to drop Parler or block the president, these decisions bring up much larger questions about how we ended up with a few huge companies holding immense power over the trajectory of public discourse.
In many ways, the Constitution and our legal frameworks have not caught up to the pace, scope, and influence of online and social media. There’s not a lot of legal guidance on how tech companies or third-party vendors should treat illegal or inflammatory content posted on their networks or produced with their tools. Lawmakers are also grappling with how much responsibility should fall on social behemoths, like Facebook, that produce and house immense amounts of online content, but are not treated like traditional publishers under the law.
This is certainly both a landmark moment and a moment of reckoning for digital media consumers and providers. It’s too soon to tell how this will push transformation in the tech world and the digital town square of social media, but we’ll be following the conversation closely.
*Attorney Advertising. Prior results do not guarantee future outcomes.