Missouri v. Biden: Tackling Health Misinformation on Social Media in a “Post-Covid” Era

By: Anushka Parihar

Though we are arguably not in a “Post-Covid” era, with more than 32,800 new Covid-19 hospitalizations from January 7-13, 2023, public health measures have been greatly loosened following the official end of the federal Covid-19 public health emergency in May 2023. Unfortunately, even as Americans have stopped wearing masks and social distancing, the prevalence of online health misinformation continues to rise.

Health misinformation is defined as medical information that is false, inaccurate, or misleading according to the best available evidence at the time. Due to the unparalleled access to misinformation on social media platforms like Meta, X, and YouTube, Americans encounter misleading information about a variety of health topics, such as vaccines, reproductive health, and smoking products and drugs.

This poses a serious threat to public safety by undermining public health guidelines and the work of medical professionals. Social media companies have attempted to alter their content moderation policies in the past, only to be hit with lawsuits aimed at limiting their ability to moderate false posts. 

The necessity of a legal solution is clear. Currently, no statute or precedent provides a cause of action against social media companies for promoting health misinformation on their platforms. This is largely because any regulation centered on health misinformation must overcome a significant barrier: the First Amendment.

S.2448: The Failed Solution

In 2021, Senator Amy Klobuchar (D-MN) introduced S.2448, the Health Misinformation Act. The bill attempted to hold social media companies accountable for promoting health misinformation during a public health emergency by limiting the liability protection afforded to those companies under Section 230 of the Communications Decency Act. Under the bill, Section 230 protection would not apply to companies that use an algorithm to push content containing health misinformation to users.

The bill died in the 2021-2023 congressional session, with many critics raising potential First Amendment concerns about government censorship of free speech.

SCOTUS to Take a Stand

In a move that is sure to provide some insight into how far the government can go to combat health misinformation on social media, the Supreme Court (SCOTUS) has agreed to hear arguments in Missouri v. Biden. Originally filed by Missouri Attorney General Eric Schmitt in 2022, the suit alleges that the Biden administration worked with large social media companies such as Meta, Twitter (now X), and YouTube to “censor and suppress free speech” related to Covid-19 and other topics such as election information. President Biden’s lawyers reject this allegation, arguing that administration officials requested, but never forced, these companies to stop the spread of posts questioning the safety and efficacy of the Covid-19 vaccine and government shutdown measures.

US District Judge Terry Doughty issued a preliminary injunction against Biden officials in July 2023, preventing them from contacting social media services for the purpose of “urging, encouraging, pressuring, or inducing in any manner the removal, deletion, suppression, or reduction of content containing protected free speech.” In September 2023, the Fifth Circuit Court of Appeals upheld a narrowed version of the district court’s injunction.

SCOTUS is set to hear arguments in March 2024 and has stayed the lower courts’ injunction in the meantime. While the ruling in Missouri v. Biden will pertain to multiple topics that Missouri claims were censored by the US government, it will also provide insight into how online health misinformation is to be dealt with going forward. The Court is slated to hear numerous cases involving First Amendment protections on social media this year, so it may well articulate a test or criteria for determining if and when the government is allowed to ask social media companies to moderate their content.

Perhaps if the only issue at stake were health misinformation, SCOTUS might determine that government intervention is key to keeping the public safe. However, because this lawsuit couples health misinformation with election materials and other information that spreads quickly on social media, it seems unlikely that SCOTUS will rule in favor of the Biden administration. Justices Alito, Thomas, and Gorsuch have already dissented from the decision to lift the injunction, saying that it gives the government a “green light” to influence the dissemination of news by asking social media platforms to alter how users’ posts are viewed.

The issue seemingly boils down to which will prove more important: combating false and misleading information about key issues or preserving the constitutional right to freedom of speech. It is important to note that the First Amendment is not an absolute right, and the Court has established certain limits on protected speech. Though government censorship of speech feels inherently antithetical to democracy, certain forms of speech are not afforded constitutional protection. Incitement, defamation, fraud, obscenity, and child pornography are all exceptions to the First Amendment. All of these exceptions have one thing in common: the potential for harm.

The potential for harm caused by the spread of health misinformation is broad. Covid misinformation resulted in a number of preventable deaths, so future instances of health misinformation should be treated as legitimate threats to public health. If SCOTUS provides a way to combat this issue by allowing government officials to request that social media platforms curb the spread of misinformed content, it could shape public health and safety in a meaningful way.
