The Supreme Court has agreed to hear two cases involving conflicting appellate rulings on state laws that restrict how online platforms moderate speech. The arguments raise important First Amendment questions, and the high court's decision could significantly shape the future of online discourse.
One of the key questions is whether social media companies and online discussion spaces should be treated as neutral platforms or as publishers. Do they actively curate online conversations, or are they impartial entities merely providing a space for civic discourse? Regulators and political watchdogs have wrestled with these questions since the early days of the internet, but they have become even more pressing with the rise of services like Facebook, X, and YouTube. The Supreme Court may provide some clarity on Monday.
In 2021, Twitter (now X) and Facebook banned the accounts of then-President Donald Trump. The decision followed years of complaints from conservatives who believed platform operators were suppressing their views. The bans prompted some states to enact laws requiring platforms to host content they would otherwise moderate or remove, and to explain their moderation decisions. Florida's SB 7072 and Texas's HB 20 are the most notable examples of this trend.
NetChoice and the Computer and Communications Industry Association quickly challenged these laws, arguing that platforms have the right to curate and moderate their own spaces as they see fit. These groups also argue that the requirement to provide detailed explanations for every moderation decision is overly burdensome.
The two laws fared differently when challenged in federal court. The Fifth Circuit upheld Texas's HB 20, while the Eleventh Circuit largely blocked Florida's SB 7072. That split between federal appellate courts prompted both sides to seek a definitive answer from the Supreme Court.
The cases, Moody v. NetChoice and NetChoice v. Paxton, turn on First Amendment arguments from all parties involved. NetChoice and its co-plaintiff argue that forcing platforms to host content they would otherwise remove constitutes compelled speech, violating the platforms' First Amendment rights. The states counter that social media giants violate users' free speech rights by censoring or banning them. The Supreme Court has agreed to resolve this dispute.
The outcome of these cases will have wide-reaching implications for online discourse, affecting far more than just social media platforms.
In an amicus brief, the Wikimedia Foundation, which owns and operates Wikipedia, stated, "These statutes would deny operators of online platforms editorial control over their own websites and force them to publish speech they do not wish to disseminate." Other organizations, including the Reporters Committee for Freedom of the Press, American Booksellers for Free Expression, and the Motion Picture Association, co-authored a separate SCOTUS amicus brief supporting platforms' ability to moderate content.
These are landmark cases, and the consequences will be far-reaching whatever the Supreme Court decides. If the justices uphold the platforms' right to moderate content, the ruling could foreclose future attempts to regulate these companies in the name of protecting individual users' speech. Conversely, if states can dictate how platforms moderate online content, hosting user speech could become subject to a patchwork of conflicting state rules, making compliance across all 50 states nearly impossible.
Image credit: Fred Schilling