Supreme Court Ruling on Social Media Affirms the First Amendment and the Government’s Responsibility to Protect the Public from the Harms of Online Misinformation
WASHINGTON — On Wednesday, the U.S. Supreme Court ruled in Murthy v. Missouri that the individual and state plaintiffs in the case did not have standing to seek a preliminary injunction against federal executive-branch officials and agencies over their official communications with social-media companies about the spread of misinformation online.
The case arose out of the Biden administration’s efforts to call on platforms like Facebook, Twitter and YouTube to remove user posts that contained lies about the COVID-19 pandemic and the outcome of the 2020 presidential election. The state attorneys general of Louisiana and Missouri sued the Biden administration in 2022, arguing that numerous government officials and agencies, including the Surgeon General’s office and the FBI, violated the First Amendment in their communications with these companies and coerced them into removing content.
In 2023, U.S. District Judge Terry A. Doughty of the Western District of Louisiana issued an initial ruling barring administration officials from communicating with social-media platforms. The Fifth U.S. Circuit Court of Appeals subsequently upheld Doughty’s ruling while narrowing its scope.
In today’s 6–3 decision, the Court held that the plaintiffs failed to demonstrate that they had suffered, or likely would suffer, harm as a result of the federal government’s communications with social-media platforms. It also held that any such alleged harm would not be redressed by the injunction the plaintiffs sought. Writing for the majority, Justice Amy Coney Barrett noted that the record in this case indicated that platforms regularly consulted with a variety of outside experts, including federal executive-branch officials, about content-moderation decisions, and that platforms “continued to exercise their independent judgment” even after those conversations.
Free Press Senior Counsel and Director of Digital Justice and Civil Rights Nora Benavidez said:
“There are essential moments when our government should be allowed, even encouraged, to contact private companies like social-media platforms and provide factual information to them, especially when issues of foreign interference, election integrity, national security and encouragement of violence crop up online and pose real-world threats. In its ruling in Murthy v. Missouri, the Supreme Court didn’t reach the question of whether the First Amendment restricts government engagement with private speech. In the Court’s opinion, however, Justice Barrett did indicate that platforms should be free to regularly communicate with outside experts and officials on content-moderation issues.
“Of course, we should be wary of government intrusions into private speech, including possible official efforts to coerce social-media platforms’ behavior. We know that officials sometimes abuse their power to silence dissenting and minority voices. Risks to free speech should not be taken lightly. But the Biden administration’s efforts to fight misinformation do not amount to censorship; rather, they are efforts to make platforms aware of the potential public harms that could result from the unvetted spread of falsehoods via their networks.
“There is already a notable gap between what social-media platforms promise to do when it comes to content that violates their own terms of service — such as disinformation and hate — and what they actually do in practice. These companies also provide very little insight into their content-moderation and enforcement decisions. If the ruling had favored the state AGs, platforms could have ignored government communications even more brazenly and evaded dialogue with other sectors such as independent researchers and civil-society organizations.
“As national elections approach, platforms like Meta, Twitter, and YouTube are in full retreat from their previous commitments to apply robust policies on misinformation and have laid off thousands of workers responsible for ensuring real safety for users — and U.S. voters. This is a perilous moment as we head toward November with social-media platforms playing a powerful role that is ripe for manipulation and interference. For now, the Court’s decision permits government officials to continue advising these platforms about possible and real threats. We hope that this ruling will encourage platforms to be more accountable for taking down the lies they help spread.”
Background: In recent polling commissioned by Free Press and conducted by the African American Research Collaborative and BSP Research, 79 percent of Americans said they are concerned that the information they see online is false, fake or a deliberate attempt to confuse. Meanwhile, 76 percent said they are concerned about encountering election misinformation, and 71 percent said they believe that “social media companies should limit false or fake information about elections that could be considered anti-democratic.”