
Regulating Online Content – The Balance Between Free Speech and Free-For-All

This year kicked off with an explosive culmination of the ongoing tensions between free speech and social media, with Twitter bans, lawsuits and enduring questions about who gets to regulate content on the internet—or whether it should be regulated at all. America is distinctly uncomfortable with the government stepping in to regulate speech. But public pressure has forced Big Tech to fill the void, spurring claims of unfair treatment and violations of First Amendment rights.

At the heart of the matter: unlike other countries that have laws against hate speech and fake news, America has largely left it up to private companies to decide what content is acceptable, with little legal obligation to explain their choices. Compounding the problem is the enormous power that, some argue, a few big tech companies wield over our online infrastructure and channels of communication, leading some to wonder whether service providers like Facebook should really be treated more like utilities, with government regulations to match.

Are there restrictions or laws regulating online content?

In Reno v. American Civil Liberties Union, the U.S. Supreme Court declared speech on the Internet equally worthy of the First Amendment’s historical protections. That means pornography, violent films, and explicit racism are all fair game on social media in the eyes of the law. The government deems only very narrow categories of speech criminal, such as “true threats,” language that is explicitly intended to make an individual or group fear for their life or safety. It’s interesting to note, though, that arguing a politician should be shot wouldn’t necessarily meet the criteria for incitement or a true threat.

Of late, America has held tightly to an interpretation of the First Amendment that protects the free marketplace of ideas, even when it comes at a cost. Landmark cases like Brandenburg v. Ohio, which protected the speech of a Ku Klux Klan leader, have solidified our particularly high bar for punishing inflammatory speech.

But America has also supported the rights of private companies to decide what kind of speech is appropriate in their venues and, by extension, their virtual squares. Unlike most of the world, where ISPs are subject to state mandates, content regulation in the United States occurs mostly at the private or voluntary level. Social media companies are allowed to set their own user policies and are expected to self-regulate, creating internal speech policies that, in theory, protect against unfair censorship.

Beyond the social media companies themselves, the regulators and legal recourse that do exist present their own set of problems. ICANN, the non-profit that controls contracts with internet registries (.com, .org, .info, etc.) and registrars (companies that sell domain names), has immense power over who gets to claim a domain name, yet its decisions are not subject to First Amendment speech claims. The Digital Millennium Copyright Act (DMCA), designed to offer anti-piracy protections, is often used as a tool of intimidation or as a means for companies to keep tight control over how consumers use their copyrighted works, stifling free speech in the process. Apple, for example, tried to use the DMCA in 2009 to shut down members of the online forum BluWiki who were discussing how to sync music playlists between iPods and iPhones without using iTunes. John Deere refuses to unlock its proprietary tractor software to let farm owners repair their own vehicles, leaving tractor owners in fear of DMCA lawsuits if they try to crack the software protections themselves.

The Growing Pressure to Regulate Content

In the absence of legal pressure, public opinion seems to be the real driver of online content regulation. It was a tipping point of public outrage that finally pushed big tech to ban the president and Parler. Apple pulled Tumblr from the App Store in 2018 because it was failing to screen out child sex abuse material, but only after multiple public complaints. After decades of proudly promoting free speech regardless of the consequences, companies like Facebook are now being forced by external pressure to police their domains, using legions of reviewers to flag harmful content.

While the world grapples with how to manage online speech, it’s clear that businesses will continue to face a variety of legal, social, and moral pressures regarding the content they provide or facilitate—and they must be prepared to monitor and account for what goes on in their virtual public spaces. Companies that allow the posting of content – words, photos, videos – have a slew of laws to consider, including free speech rights and controls. Companies should work with sophisticated and experienced tech legal counsel, like Beckage, to address these issues.


Parler v. Amazon Web Services – The Ongoing Conversation Surrounding Social Media, Big Tech, and Freedom of Speech

As the fallout from last week’s attack on the Capitol continues to be front-page news, big questions surround big tech’s role as the arbiter of acceptable online speech.

After Facebook suspended President Trump’s account indefinitely and Twitter shut him down permanently, YouTube announced Wednesday that it would freeze the president’s account for a week, citing concerns over the ongoing potential for violence.

Apple, Google, and Amazon have also pulled the plug on Parler, a social network that has become increasingly popular with conservatives in recent months and has a reputation for allowing content that would not be tolerated on other channels, including numerous calls for violence. Parler has responded by filing a lawsuit against Amazon, claiming that Amazon Web Services (AWS) violated antitrust laws and breached its contract by not providing 30 days’ notice of cancellation.

In the 18-page complaint, filed in the U.S. District Court for the Western District of Washington, Parler argues that the decision to suspend its account “is apparently motivated by political animus” and designed to “reduce competition in the microblogging services market to the benefit of Twitter,” which recently signed a long-term deal with AWS and stands as one of Parler’s main competitors. The suit includes claims for breach of contract, tortious interference, and violation of antitrust law, alleging that Amazon failed to take similar action against Twitter, whose platform carried similar rhetoric. Parler is seeking a temporary restraining order to prevent Amazon from removing the social platform from its servers and to prevent what it says will be irreparable harm to its business.

Can Amazon really do that? What about the First Amendment?

The suit also comes as tensions over alleged First Amendment violations remain high. It’s well established that the First Amendment limits the government’s ability to restrict people’s speech, not private businesses’ ability to do so. Stated differently, the First Amendment applies only to public places, not private spaces such as a social media platform. But not so fast – in 1980, the Supreme Court in Pruneyard Shopping Center v. Robins held that a shopping mall owner could not exclude a group of high school students who were engaged in political advocacy in the quasi-public spaces of a private shopping mall. The Court accepted the argument that it was within California’s power to guarantee this expansive free speech right since it did not unreasonably intrude on the rights of private property owners. Likewise, in 2017, the Supreme Court in Packingham v. North Carolina held that the First Amendment prohibited the government from banning sex offenders from social media websites, implicitly finding social media to be a public space. The question, then, of whether Twitter and other social media spaces where people congregate, along with their associated cloud servers, are “public” and deserving of First Amendment protections is not clear-cut.

For its part, Amazon claims it was well within its rights to drop Parler after it failed to promptly identify and remove content encouraging or inciting violence against others, a direct violation of Amazon’s terms of service. According to court documents, Amazon says it reported more than a hundred examples of such violative content to Parler in just the past several weeks. In its official response to Parler’s restraining order request, AWS states that this “case is not about suppressing speech or stifling viewpoints. It is not about a conspiracy to restrain trade. Instead, this case is about Parler’s demonstrated unwillingness and inability to remove from the servers of Amazon Web Services (‘AWS’) content that threatens public safety.”

Most experts see Amazon’s decision to remove Parler as legitimate, and the microblogger will have a steep climb arguing against what appear to be clear violations of its terms. It’s also not without precedent: Cloudflare, a company that provides tools to help websites protect against cyberattacks and load content more quickly, made a similar decision after facing pressure to drop The Daily Stormer, a neo-Nazi hate site, from its service after the deadly riots in Charlottesville in 2017. It later dropped 8chan, a controversial forum linked to several deadly attacks, including those in El Paso, Texas and Christchurch, New Zealand.

What does this mean for businesses, consumers and the future of social media?

While this case was born out of a national crisis, there is little incentive and less legal standing for businesses to start an online political witch hunt. As Amazon stated in its response to Parler, “AWS has no incentive to stop doing business with paying customers that comply with its agreements.”

But while Amazon and others are arguably on solid legal ground in their choice to drop Parler or block the president, these decisions bring up much larger questions about how we ended up with a few huge companies holding immense power over the trajectory of public discourse.

In many ways, the Constitution and our legal frameworks have not caught up to the pace, scope, and influence of online and social media. There’s not a lot of legal guidance on how tech companies or third-party vendors should treat illegal or inflammatory content posted on their networks or produced with their tools. Lawmakers are also grappling with how much responsibility should fall on social behemoths, like Facebook, that produce and house immense amounts of online content, but are not treated like traditional publishers under the law.

This is certainly both a landmark moment and a moment of reckoning for digital media consumers and providers. It’s too soon to tell how this will push transformation in the tech world and the digital town square of social media, but we’ll be following the conversation closely.


Social Media in the Workplace? Here’s How to Make it Work.

Twitter, Instagram and Facebook are now an everyday part of our lives, and that includes the workplace. But while social media can be an excellent communication and marketing tool for businesses, personal use of social media at work can interfere with productivity and pose serious data and cybersecurity risks. So how can businesses mitigate these risks and help make sure the company isn’t trending for all the wrong reasons?

Create an Acceptable Media Use Policy

Make sure you have a clearly outlined social media use policy in place, such as an Acceptable Media Use Policy. These policies typically warn employees that they:

• May not divulge trade secrets or confidential or proprietary information online

• Can be held accountable for content they post on the Internet—whether in the office, at home or on their own time—particularly if something they post or share violates other company policies

• May need approval (from a specific person or department) before posting certain types of information that could be associated with the organization, employees or customers

The most successful social media use policies also:

• Explain employee productivity expectations in conjunction with social media habits

• Provide examples of policy violations

• Explain disciplinary measures for policy violations

Overall, employees need to understand that they are ambassadors for the organization’s corporate brand. What they write on social media could be disseminated to the world—even if they only share it with their “friends.” Encourage employees to think twice before posting comments they would not say out loud or that they would not want their CEO or grandparents to see. Employees should be encouraged to use disclaimers and speak in the first person to make it clear that any opinions expressed are not those of their employer.

A note for unionized workforces: Employers operating in union environments need to be mindful of additional requirements that may impact their policies under the National Labor Relations Act (NLRA). Under the NLRA, policies that are too broad or too restrictive might interfere with workers’ rights to complain about their employer and discuss the terms and conditions of employment with other employees. Always review any policies with counsel before implementing them to make sure they are suitable for your particular circumstances.

Make Training Mandatory

Even the best social media policies won’t go far if employees aren’t properly trained on social networking’s benefits and pitfalls. Training should be succinct and interactive, including real examples and tabletop exercises covering both the specifics of your social media use policy and more general best practices for using social media responsibly.

At Beckage, we encourage employers to leverage training such as Cybersecurity Best Practices 101, which covers topics like network security and protecting confidential and proprietary information. Organizations must educate employees about how a downloaded application or even a simple click can infect computers and the network at large. A critical concern about social networking platforms is that they encourage people to share personal information. Even the most cautious and well-meaning people can give away the wrong kind of information on company-approved social networking platforms.

Address Negative Incidents Promptly

If it seems like an employee is misusing social media at work or there’s a negative incident, it’s important to promptly investigate, document all conversations, review internal policies and procedures and take disciplinary action if warranted.

But be aware that workers’ speech is protected in certain situations. In addition to the National Labor Relations Act, federal and state employment laws protect employees who complain about harassment, discrimination, workplace safety violations and other issues.

Be Careful Using Social Media During the Hiring Process

Employers must exercise caution when using social networks during the recruiting or hiring processes. Social media can play a role in the screening process, but employers should consider when and how to use social media this way and weigh potential legal pitfalls. For example, a candidate could claim that a potential employer did not offer a job because of legally protected information found on a social networking site (such as race, ethnicity, age, associations, family relationships or political views).

In short, successfully managing social media in the workplace comes down to the employer’s policy: in today’s workplace, all employers should have a robust policy, train on it annually, and then consistently enforce it. If you’re not sure where to start, turn to experienced legal counsel to craft a social media policy that works for your company culture and brand. The experienced team at Beckage PLLC can help navigate state and federal laws, pinpoint potential social media pitfalls, and ultimately set your employees on the path to social media savvy.

*Attorney Advertising. Prior results do not guarantee a similar outcome.
