State laws targeting social media break First Amendment precedents
Originally published in the Summer 2022 issue of California Publisher.
By Jason M. Shepard
After Twitter and other social media companies banned President Donald Trump from their platforms, many of Trump’s supporters decried content moderation policies as among the biggest threats to the Republic.
“Silicon Valley is acting as a council of censors,” Florida Gov. Ron DeSantis said in May 2021 as he signed a bill into law that targets social media companies. “They cancel people when mobs come after somebody. They will pull them down.”
Social media companies cited Trump’s incitement of violence and spread of misinformation leading to the insurrection at the U.S. Capitol on Jan. 6, 2021, as reasons for kicking him and others off their platforms. In response, Trump unsuccessfully sued, claiming his First Amendment rights were violated. He also created his own social media platform, Truth Social.
Now, several state legislatures are pushing back in unprecedented ways against Twitter, Facebook and YouTube, among other big tech companies.
One study found that 33 states introduced legislation in the last year aiming to prevent social media platforms from deplatforming individuals who post content the platforms deem harmful or dangerous. Most bills set fines for companies that fail to comply.
While bills failed to become law in most states, Florida and Texas passed laws now being challenged in federal courts.
On their face, the Florida and Texas laws seem clearly unconstitutional, violating longstanding principles of First Amendment law establishing that private individuals and companies like Twitter have free speech rights to speak as they wish.
The laws also run afoul of a federal statute known as “Section 230” of the Communications Decency Act, which gives tech companies broad legal immunity from liability for content on their sites while also making clear that they have a right to remove, in good faith, content they don’t want to host.
Twitter, for example, has a list of prohibited content that includes threats or glorification of violence, harassment, hateful conduct, non-consensual intimate photos or videos, or purposeful manipulation of elections, among other things.
Content moderation is happening all the time. In the last six months of 2021, for example, Twitter reported removing 5.1 million posts and suspending 1.3 million accounts for violating its terms of service.
That hasn’t stopped Republicans from reframing content moderation and deplatforming as dangerous acts of censorship by powerful tech companies — muddying the water in debates about free speech and setting up a potentially precedent-setting review by the U.S. Supreme Court.
In Florida, the state Legislature passed and Gov. Ron DeSantis signed S.B. 7072 in May 2021, heralded by the governor in a statement as “driving transparency and safeguarding Floridians’ ability to access and participate in online platforms.”
The law prevents large online platforms from deplatforming candidates running for public office. Companies that ban candidates for statewide office face fines of $250,000 per day, and $25,000 per day for candidates for non-statewide offices.
The law also requires companies to publish and consistently apply rules they use to block content and ban users.
Two trade associations representing online businesses, NetChoice, LLC and the Computer & Communications Industry Association, filed a lawsuit, NetChoice, LLC v. Moody, to stop the law from going into effect.
“The Act brazenly infringes and facially violates the First Amendment rights of America’s leading businesses by compelling them to host even highly objectionable content that is not appropriate for all viewers, violates their terms of service, or conflicts with the companies’ policies and beliefs,” NetChoice and CCIA argued in their complaint.
A U.S. district court judge issued a preliminary injunction blocking the law in June 2021. Judge Robert Hinkle of the U.S. District Court for the Northern District of Florida said the law likely violates the First Amendment as a viewpoint-based content restriction and is also likely preempted by the federal Section 230 statute.
Judge Hinkle suggested the law was also problematic because it is “riddled with imprecision and ambiguity” with sections that are “especially vague.”
The 11th U.S. Circuit Court of Appeals, based in Atlanta, agreed in May 2022, upholding the preliminary injunction against most of the law while allowing some provisions to take effect.
“We hold that it is substantially likely that social-media companies — even the biggest ones — are ‘private actors’ whose rights the First Amendment protects, that their so-called ‘content-moderation’ decisions constitute protected exercises of editorial judgment, and that the provisions of the new Florida law that restrict large platforms’ ability to engage in content moderation unconstitutionally burden that prerogative,” the three-judge appellate panel wrote.
Florida officials announced their intent to appeal the decisions to the U.S. Supreme Court, and NetChoice and the CCIA also said they would ask the Court to take the case.
In Texas, a similar legal challenge is taking place over H.B. 20, a statute even more extreme than Florida’s.
The Texas law, passed in September 2021, treats large social media companies like public utility common carriers, such as telephone companies, that should not be allowed to discriminate against content based on viewpoint. The law prevents deplatforming based on political viewpoints and allows individuals to sue companies directly for viewpoint discrimination.
The law also, among other things, prohibits the removal of most offensive content unless it falls within one of the categories of speech unprotected by the First Amendment, such as incitement to violence or true threats. It further prohibits companies from placing warning labels on offensive posts or using algorithms to hide them.
“Social media websites have become our modern-day public square,” Texas Gov. Greg Abbott said in a press release. “They are a place for healthy public debate where information should be able to flow freely — but there is a dangerous movement by social media companies to silence conservative viewpoints and ideas. That is wrong, and we will not allow it in Texas.”
NetChoice and CCIA filed a lawsuit, NetChoice v. Paxton, to block the Texas law. In December 2021, a district court issued a preliminary injunction stopping the law from taking effect, finding it to be a likely violation of the First Amendment.
Social media companies are not common carriers like telephone companies or the post office, wrote U.S. District Judge Robert Pitman of the Western District of Texas. Rather, they are private communications companies that regularly exercise editorial discretion over the content they choose to publish, a right protected by the First Amendment under many Supreme Court precedents.
“Social media platforms have a First Amendment right to moderate content disseminated on their platforms,” the judge wrote in temporarily enjoining the law from being enforceable.
In May 2022, the Fifth Circuit Court of Appeals, based in New Orleans, issued a one-sentence ruling staying Judge Pitman’s preliminary injunction. NetChoice asked the Supreme Court to vacate the appellate ruling, which the Court did on May 31 in a 5–4 vote.
The Supreme Court decision simply means the case returns to the district court for a full hearing, while the law remains unenforceable based on the preliminary injunction.
“Texas’s HB 20 is a constitutional trainwreck,” NetChoice Counsel Chris Marchese said in a statement. “We are relieved that the First Amendment, open internet, and the users who rely on it remain protected from Texas’s unconstitutional overreach.”
Ultimately, the Florida and Texas cases may end up back at the Supreme Court for a ruling on the merits.
The Supreme Court could take the cases for a variety of reasons. A split among federal appellate courts is often a reason for the Supreme Court to take a case, to resolve disputes and provide clarity for the legal system. The Court can also take cases raising novel questions of exceptional importance.
In a joint motion seeking Supreme Court review in the Florida case, the parties wrote, “First, this case plainly presents important questions that warrant Supreme Court review. Under review in this case is a ‘first-of-its-kind law’ that regulates social media platforms. Whether and to what extent states may regulate social media platforms is an issue of profound importance.”
If the Supreme Court applies First Amendment precedents as it has in the past, both state laws should be struck down as unconstitutional.
The Court has long recognized publishers’ rights to editorial autonomy and protection against compelled speech, and those precedents suggest the First Amendment clearly protects the right of Twitter and other social media companies to moderate content on their platforms.
Still, social media didn’t exist at the time many past First Amendment precedents were written, and this newly constituted Supreme Court isn’t afraid to overturn past rulings, as it did this summer in Dobbs v. Jackson Women’s Health Organization, overturning the landmark holding in Roe v. Wade about a woman’s constitutional right to have an abortion.
In recent dissents, at least two justices, Justice Clarence Thomas and Justice Samuel Alito, have called into question whether past First Amendment precedents apply to platforms like Twitter.
“It is not at all obvious how our existing precedents, which predate the age of the internet, should apply to large social media companies,” Justice Alito wrote in dissenting from the decision to vacate the Fifth Circuit’s ruling in the Texas case.
If the Florida and Texas cases ultimately reach the Supreme Court, the power of several foundational First Amendment precedents under this Supreme Court will also be under scrutiny.
Jason M. Shepard, Ph.D., is professor and chair of the Department of Communications at California State University, Fullerton. His primary research expertise is in media law, and he teaches courses in journalism and in media law, history and ethics. Contact him at jshepard@fullerton.edu or on Twitter at @jasonmshepard.