Internet Speech Cases Turn on Procedural Grounds as the Supreme Court Articulates Strong First Amendment Protections for Content Moderation
WASHINGTON — On Monday, the U.S. Supreme Court vacated and remanded two cases challenging state laws that forbid platforms from taking down or even deemphasizing public posts that violate a company’s content-moderation rules. The Court held that neither the Eleventh Circuit nor the Fifth Circuit had conducted a “proper facial analysis” of the First Amendment challenges to the Florida and Texas laws regulating large internet platforms.
In oral arguments in NetChoice, LLC v. Paxton and Moody v. NetChoice, lawyers representing platform-technology companies argued that the state laws infringe on platforms’ First Amendment rights by forcing a site like Facebook to host users and content that violate its terms of service, including posts that break company rules against hate speech and election disinformation. Lawyers representing Florida and Texas defended the laws’ intrusion on the First Amendment rights of technology platforms by claiming that these corporations are common carriers akin to telephone companies.
While the Court remanded these cases for more fact-finding by the lower courts, it noted that the content-moderation decisions challenged by Texas and Florida’s laws are “exactly the kind of editorial judgments this Court has previously held to receive First Amendment protection” and suggested that a law which prevents a platform from deciding which speech it will or will not host “is unlikely to withstand First Amendment scrutiny.”
Free Press Senior Counsel and Director of Digital Justice and Civil Rights Nora Benavidez said:
“While Free Press believes that tech companies should bolster their platform-accountability measures across the board, the First Amendment is clear: The government does not have the right to impose rules on how companies like Meta and Google should accomplish this. While today’s decision rests on procedural grounds, Justice Kagan’s comprehensive opinion for the Court explains in very clear terms why the Florida and Texas laws will have a tough time ever passing First Amendment muster. That’s a very good thing. Getting the government involved in this way would have caused far more problems than it would have cured. These laws would have further ratcheted up the amount of hate and disinformation online while undermining both the meaning and the intent of the First Amendment.
“Social-media companies have a crucial role in shaping public attitudes, especially during pivotal election years. Regulations that give state officials control over private companies’ content-moderation decisions run afoul of the First Amendment and risk forcing platforms to keep up lies and other content that violates their terms of service. As we head into one of the most significant elections in recent memory, regulatory schemes to force platforms to keep false and harmful content up are not the answer, especially when those unconstitutional mandates are predicated on penalties that state actors impose for decisions concerning private speech.
“One of the fundamental values underpinning the First Amendment is that our government cannot dictate the terms of public debate. The Florida and Texas laws bolstered state authority to intervene into private speech. The natural byproduct of such a government mandate would be more misinformation, more extremism and more hate online.
“Tech companies’ executives have a track record of negligence when it comes to leaving up harmful content. Today’s ruling should send a message to the likes of Mark Zuckerberg and Elon Musk: Your commitment to platform integrity is protected under the First Amendment. While there could be further court proceedings based on today’s procedural holding, the Court sent a clear signal that unconstitutional efforts to regulate content moderation will face withering scrutiny.”
Background: In December 2023, Free Press released Big Tech Backslide: How Social-Media Rollbacks Endanger Democracy Ahead of the 2024 Elections, a report that documents the retreat of Meta, Twitter and YouTube from earlier pledges to protect election integrity.
Recent polling commissioned by Free Press and conducted by the African American Research Collaborative and BSP Research found that 79 percent of Americans are concerned that the information they see online is false, fake or a deliberate attempt to confuse. Meanwhile, 76 percent are concerned about encountering election misinformation, and 71 percent believe that “social media companies should limit false or fake information about elections that could be considered anti-democratic.”