Moody v. NetChoice: Supreme Court Ruling on Social Media Free Speech

The Supreme Court held that content moderation by social media platforms is expressive activity protected by the First Amendment, vacating lower court rulings on Florida and Texas laws that sought to control how online speech is managed.

Key Takeaways

  1. No Social Media Exception to the First Amendment: The Supreme Court unanimously vacated lower court rulings, reaffirming that content moderation by online platforms is a form of protected speech under the First Amendment. State laws that compel platforms to host or prioritize certain speech must survive rigorous constitutional review.
  2. State Regulation of Content Moderation Faces High Scrutiny: The Court held that state laws in Florida and Texas, which sought to restrict social media companies’ content moderation practices, must be thoroughly examined under First Amendment principles before enforcement.
  3. Editorial Discretion of Online Platforms Is Protected: The decision preserves the autonomy of social media companies to moderate, curate, and organize content according to their own policies, without undue government interference.

Introduction

In 2024, the United States Supreme Court decided two landmark cases—Moody v. NetChoice, LLC and NetChoice, LLC v. Paxton—that addressed the intersection of state regulation, the First Amendment, and the content moderation practices of major internet platforms. These cases, docketed as 22-277 and 22-555, respectively, stemmed from efforts by Florida and Texas to enact laws that would significantly curb the ability of large social media companies to manage the content posted by their users.

These cases are pivotal in the ongoing debate about the role of government in regulating online speech, the rights of private companies to exercise editorial discretion, and the extent to which the First Amendment protects digital platforms. The Supreme Court’s decision, delivered by Justice Kagan, vacated and remanded the judgments of the Eleventh and Fifth Circuits for failing to conduct a proper First Amendment analysis. The full opinion is available at the Supreme Court’s official website.

This guide provides a comprehensive overview of the legal issues, procedural history, Supreme Court reasoning, and implications of these cases for attorneys, policymakers, and anyone interested in the evolving landscape of online free speech.


Background and Procedural History

The State Laws at Issue

In recent years, state legislatures in Florida and Texas enacted laws designed to curtail the ability of major social media platforms to moderate content. These statutes were a response to concerns—primarily among conservative lawmakers—that platforms like Facebook, Twitter (now X), and YouTube were censoring or deprioritizing conservative viewpoints.

  • Florida’s S.B. 7072: This law prohibited platforms from deplatforming political candidates and required them to apply content moderation standards consistently. It also imposed transparency requirements regarding moderation policies.
  • Texas’s H.B. 20: This statute barred social media companies with more than 50 million monthly active users in the United States from censoring users or content based on viewpoint and required detailed disclosures about content moderation practices.

Both laws included enforcement mechanisms and penalties for noncompliance, directly targeting the editorial discretion of large online platforms.

The Lawsuits and the Circuit Split

Trade associations NetChoice, LLC and the Computer & Communications Industry Association (CCIA), representing major online platforms, filed lawsuits challenging both statutes. They argued that the laws infringed upon the platforms’ First Amendment rights by compelling them to host speech they would otherwise remove or deprioritize.

  • Eleventh Circuit (Florida Law): The Eleventh Circuit largely blocked enforcement of Florida’s law, finding that the statute likely violated the First Amendment by intruding on the editorial discretion of platforms.
  • Fifth Circuit (Texas Law): In contrast, the Fifth Circuit upheld Texas’s law, reasoning that social media platforms were not akin to newspapers or other media with editorial rights, and thus could be subject to state regulation.

These conflicting decisions set the stage for Supreme Court review.


The Supreme Court’s Review

Questions Presented

The Supreme Court consolidated the cases to address a fundamental question: Do state laws that restrict the content moderation practices of large social media platforms, or compel them to carry speech, violate the First Amendment?

The Court focused on whether the lower courts had properly analyzed the laws’ implications for free speech, particularly considering that content moderation involves editorial judgments.

Oral Arguments

Oral arguments, held in February 2024, reflected deep concerns about government overreach, compelled speech, and the potential consequences for both online platforms and users. Justices probed the boundaries between permissible regulation and unconstitutional interference with private editorial discretion.

For a summary and analysis of the oral arguments, see SCOTUSblog’s case page.


The Supreme Court’s Decision

Holding and Reasoning

The Supreme Court unanimously vacated and remanded the judgments of both lower courts in an opinion delivered by Justice Kagan. The Court found that neither the Eleventh nor the Fifth Circuit had conducted the necessary rigorous First Amendment analysis.

Key Points from the Opinion

  1. Content Moderation as Protected Speech: The Court recognized that decisions by platforms to prioritize, deprioritize, or remove content are inherently expressive and thus protected under the First Amendment. This includes algorithmic curation, which shapes the user experience and reflects the platform’s editorial judgment.
  2. No Social Media Exception: The Court emphatically rejected the notion that social media platforms are exempt from First Amendment protections simply because they host third-party speech. The First Amendment does not allow the government to force private entities to carry speech against their will.
  3. Insufficient Lower Court Analysis: The Court criticized both circuits for failing to apply a rigorous First Amendment framework. The justices emphasized that any law compelling or restricting editorial discretion must be scrutinized for its impact on free expression.

The full opinion is available at supremecourt.gov.

Vacatur and Remand

Rather than issuing a definitive ruling on the constitutionality of the Florida and Texas laws, the Supreme Court vacated the lower courts’ decisions and remanded the cases for further proceedings. The lower courts were instructed to conduct a proper First Amendment analysis, weighing the statutes’ impact on editorial discretion and compelled speech.


First Amendment Analysis and Content Moderation

Editorial Discretion as Speech

The Court’s opinion reaffirmed a core principle: The First Amendment protects not just the right to speak, but also the right not to speak or to curate speech. This principle has deep roots in Supreme Court precedent, including cases involving newspapers, parade organizers, and cable operators.

  • In Miami Herald Publishing Co. v. Tornillo (1974), the Court struck down a Florida law requiring newspapers to publish replies from political candidates they criticized, holding that editorial discretion is protected speech.
  • The Court analogized these precedents to social media platforms, noting that algorithmic and human moderation are modern forms of editorial judgment.

Compelled Speech and Government Regulation

The Supreme Court’s decision underscores the dangers of compelled speech—when the government forces private actors to host, promote, or display speech they disagree with. The Court warned that allowing states to dictate content moderation policies would set a dangerous precedent, giving the government undue control over private speech.

As the Foundation for Individual Rights and Expression (FIRE) observed, the ruling confirms that “there is no social media exception to the First Amendment” (FIRE analysis).

Viewpoint Discrimination and Neutrality

Both Florida and Texas justified their laws as efforts to ensure “viewpoint neutrality” on platforms. However, the Supreme Court clarified that the First Amendment generally forbids the government from compelling private parties to be neutral or to carry speech they would otherwise exclude. The right to editorial discretion includes the right to make viewpoint-based decisions.


Practical Implications for Online Platforms

Autonomy in Content Moderation

The Supreme Court’s decision is widely regarded as a victory for online platforms. It affirms that companies like Facebook, YouTube, and X (formerly Twitter) have the right to set their own community guidelines, moderate content, and curate user experiences without government interference.

This autonomy is crucial for platforms to:

  • Combat misinformation and harmful content
  • Foster diverse online communities
  • Respond flexibly to evolving social and political challenges

Risks of Government Overreach

The decision sends a clear message to state legislatures: Efforts to micromanage content moderation practices will face intense constitutional scrutiny. Laws that attempt to force platforms to host or prioritize certain speech, or that penalize them for removing content, are likely to be struck down unless they can survive a rigorous First Amendment analysis.

The Congressional Research Service provides a detailed discussion of the implications for future state laws and potential facial challenges under the Free Speech Clause (CRS report).

Impacts on Users and Public Discourse

For users, the decision means that social media platforms retain the freedom to enforce their own rules, which may include banning hate speech, misinformation, or other content deemed harmful. While some critics argue this gives platforms too much power, the alternative—government-mandated speech—poses even greater risks to free expression.


The “Publisher or Platform” Debate

A central theme in the litigation was whether social media companies are more like publishers (with editorial rights) or common carriers (subject to regulation). The Supreme Court’s opinion leans toward the publisher analogy, recognizing that platforms exercise significant editorial control through algorithms and moderation policies.

Section 230 and Federal Preemption

Although not directly at issue in these cases, Section 230 of the Communications Decency Act remains a foundational federal law protecting platforms from liability for user-generated content and for their moderation decisions. The Supreme Court’s emphasis on editorial rights dovetails with Section 230’s immunity provisions, potentially limiting the scope for state regulation.

Ongoing Legislative Efforts

Despite the Court’s ruling, state and federal lawmakers continue to propose legislation targeting online platforms’ content moderation practices. The decision in Moody v. NetChoice serves as a warning that such laws must be carefully crafted to avoid infringing on First Amendment rights.

For further reading, see the First Amendment Encyclopedia’s entry on Moody v. NetChoice.


Future Litigation and Unanswered Questions

What Happens Next?

By vacating and remanding, the Supreme Court left open the possibility that some regulations—such as transparency requirements—could survive constitutional scrutiny if they do not unduly burden editorial discretion. The lower courts must now apply the First Amendment framework outlined by the Supreme Court to the specific provisions of the Florida and Texas laws.

The Scope of Protected Editorial Discretion

One unresolved issue is the precise boundary between permissible regulation (such as requiring disclosure of moderation policies) and unconstitutional compelled speech. The outcome of the remanded cases will further clarify how far state governments can go in regulating platform conduct.

Potential for Further Supreme Court Review

Given the high stakes and evolving nature of online speech, additional cases are likely to reach the Supreme Court in the coming years. Questions about the application of the First Amendment to emerging technologies, artificial intelligence, and new forms of online expression remain unsettled.


Conclusion

Moody v. NetChoice, LLC and NetChoice, LLC v. Paxton represent a watershed moment in the law of online speech and government regulation. The Supreme Court’s decision to vacate and remand the lower courts’ judgments underscores the critical importance of a thorough First Amendment analysis when evaluating state efforts to regulate content moderation.

At its core, the ruling reaffirms that editorial discretion—whether exercised by newspapers, parade organizers, or social media platforms—is a fundamental component of free speech. The government cannot compel private actors to host or prioritize speech against their will, and any attempt to do so must withstand the most exacting constitutional scrutiny.

Attorneys, policymakers, and platform operators must closely follow the ongoing litigation and evolving legal standards. For the most up-to-date research and analysis, visit Counsel Stack.


Disclaimer: This guide provides a general overview of Moody v. NetChoice, LLC and related legal developments. It is not a substitute for professional legal advice. The issues discussed are complex and subject to change as litigation continues. For specific legal guidance, consult a qualified attorney.

About the author
Von Wooding, Esq.

Lawyer and Founder, Counsel Stack
