Murthy v. Missouri Ruling Could Allow Social-Media Platforms to Leave Up Misinformation and Violent Content If Supreme Court Finds that Government Communications Violate First Amendment
WASHINGTON — On Monday, the U.S. Supreme Court heard arguments in Murthy v. Missouri, a case that will determine whether any limits should be placed on the federal government’s ability to communicate with social-media companies to combat misinformation.
The case arose from the Biden administration’s calls on social-media platforms to remove posts containing lies about the COVID-19 pandemic and the outcome of the 2020 presidential election. The attorneys general of Louisiana and Missouri sued the administration in 2022, arguing that numerous government officials and agencies, including the Surgeon General’s office and the FBI, violated the First Amendment in their communications with these companies and unlawfully coerced them into removing content.
In 2023, U.S. District Judge Terry Doughty of Louisiana issued an initial ruling barring administration officials from communicating with social-media platforms. The Fifth U.S. Circuit Court of Appeals subsequently upheld Doughty’s ruling while narrowing its scope.
The case, which will be decided in June, will have sweeping implications in the run-up to the 2024 presidential election.
Free Press Senior Counsel and Director of Digital Justice and Civil Rights Nora Benavidez said:
“In its ruling in Murthy v. Missouri, the Supreme Court should not redefine the boundaries of government coercion on private speech. There are essential moments when our government should be allowed, even encouraged, to contact private companies like social-media platforms when issues of foreign interference, national security and encouragement of violence crop up online and pose real-world threats.
“Of course, we should be wary of government intrusions into private speech, including possible official efforts to coerce social-media platforms’ behavior. We know that officials sometimes abuse their power to limit speech from dissenting and minority opinions. Risks to free speech should not be taken lightly. But the Biden administration’s efforts to fight misinformation do not amount to censorship.
“There is already a notable gap between what social-media platforms promise to do when it comes to violative content — such as disinformation and hate — and what they actually do in practice. Moreover, these companies provide very little insight into their content-moderation and enforcement decisions. If the Supreme Court limits contact between government officials and social-media platforms, accountability will likely remain elusive. In fact, platforms may very well use such a ruling to shirk communications even more brazenly and evade dialogue with other sectors such as researchers and civil society.
“At a moment when platforms like Meta, Twitter and YouTube have retreated from their previous commitments to apply robust policies on misinformation and have laid off tens of thousands of workers, real safety for users — and U.S. voters — hangs in the balance. This is a perilous moment as we head toward our national elections with social-media platforms continuing to play a powerful role that is ripe for manipulation and interference.”