
Supreme Court hears social media cases that could reshape how Americans interact online


Washington — The Supreme Court heard arguments Monday in a pair of social media cases that could transform online speech.

The two cases concern disputes over Republican-backed laws in Florida and Texas that aim to restrict social media companies from moderating content, which tech groups representing platforms like Facebook and X see as a violation of their First Amendment rights. 

The laws, both passed in 2021, came in response to what their backers saw as discrimination by social media platforms. The controversy followed social media companies' decisions to ban former President Donald Trump from their platforms after his handling of the Jan. 6, 2021, attack on the U.S. Capitol. (Trump's accounts were eventually reinstated.)

The states argue that the social media companies should be treated like any other business and barred from removing posts or banning users based on their views. The social media companies counter that the laws infringe on their editorial discretion, arguing that they should be treated more like news outlets.

Both the Biden administration and Trump have weighed in on the dispute, upping the ante on the political implications. 

While Trump filed a brief in support of the state laws, arguing that a platform's "decision to discriminate against a user" is not protected under the Constitution, the Biden administration filed a brief in support of the tech groups. It argued among other things that the high court has "repeatedly held" that the presentation of speech generated by others is protected under the First Amendment, as is often seen among the opinion pages of many newspapers.

The Texas and Florida social media laws

One of the cases involves a 2021 Florida law that regulates large social media platforms in an effort to combat claims of censorship. It does so by prohibiting platforms from engaging in certain types of content moderation, while requiring platforms to notify a user if the company removes or alters a post. It also compels platforms to make disclosures about their operations and policies. 

Two tech groups challenged the Florida law in federal court in 2021. The district court blocked enforcement of the measure, determining that it likely violates the First Amendment. Florida appealed the decision, and the U.S. Court of Appeals for the 11th Circuit sided with the tech groups.

The other case centers on a Texas law that likewise regulates platforms, imposing restrictions on content moderation, requiring a platform to notify a user when content is removed, and compelling platforms to disclose how they moderate and target content.

The tech groups challenged the Texas law in federal district court in September 2021 on constitutional grounds. The lower court blocked enforcement of two of its provisions, but the U.S. Court of Appeals for the 5th Circuit, based in New Orleans, later lifted the injunction and allowed the law to take effect, determining that states can regulate content-moderation activities without violating the First Amendment.

Oral arguments in the social media cases 

Over nearly four hours, the justices heard arguments in the Florida and Texas cases. In the Florida case in particular, the justices spent much of that time trying to zero in on nuanced differences among social media platforms and their functions, which appeared to create confusion and division about how to move forward.

Florida Solicitor General Henry Whitaker laid out his case, arguing that the social media platforms don't have a First Amendment right to apply their censorship policies "in an inconsistent manner" and to deplatform or censor certain users. 

Whitaker said that although social media platforms achieved success by "marketing themselves as neutral forums for free speech," they now "sing a very different tune."

"They contend that they possess a broad First Amendment right to censor anything they host on their sites, even when doing so contradicts their own representations to consumers," Whitaker said. "But the design of the First Amendment is to prevent the suppression of speech, not to enable it."

Chief Justice John Roberts questioned whether the first concern should be with the state regulating "the modern public square." Whitaker argued that states have an interest in ensuring the free dissemination of ideas, since large businesses have the power to silence speakers.

The justices attempted to drill down on the distinctions between the social media platforms and newspapers, noting, for example, how Facebook's news feature differs from other aspects of the platform.

"I feel like there's a lot of indeterminacy in this set of facts and in this circumstance," Justice Ketanji Brown Jackson said. "We're not quite sure who it covers, we're not quite sure exactly how these platforms work."

Whitaker argued that the high court needs to draw a line between a "selective speech" host that's exercising editorial control and a "common carrier" host.

Paul Clement, an attorney for the tech groups, argued that the Florida law violates the First Amendment "several times over" by interfering with editorial discretion, compelling speech and discriminating on the basis of content, speaker and viewpoint.

"Given the vast amount of material on the internet in general and on these websites in particular, exercising editorial discretion is absolutely necessary to make the websites useful for users and advertisers," Clement said. "And the closer you look at Florida's law, the more problematic the First Amendment problems become."

The justices also grilled Clement on how the law would apply to various social media platforms, though he asserted that the legislation is motivated by expressive activity on platforms like YouTube and Facebook, which makes moderating that activity a "classic editorial decision."

Justice Amy Coney Barrett posited that even if she agreed with Clement's argument as it relates to the core social media platforms, the justices must consider how a ruling would apply to other platforms like Uber and Etsy because they must look at the statute as a whole.

The justices repeatedly came back to the distinction between direct-messaging and e-commerce sites on the one hand and more traditional social media forums on the other, often appearing confounded about how to rule given the laws' potentially widespread application.

U.S. Solicitor General Elizabeth Prelogar, who also argued against the Florida law, encouraged the court to take a "really narrow" approach, reserving judgment on the law's application to e-commerce sites, which she said aren't creating a comparable product.

In the Texas case, which appeared to apply more narrowly to traditional social media sites, the justices still struggled to find a clear path forward.

Texas Solicitor General Aaron Nielson argued that "if platforms that passively host the speech of billions of people are themselves the speakers and can discriminate, there will be no public square to speak of."

Nielson said the implications are "gravely serious," arguing that the Texas law is a "modest effort" to regulate the power of social media.

The arguments illustrated an issue that has likewise plagued Congress when it comes to regulating social media and internet companies — lawmakers and justices aren't generally experts on the functioning of rapidly changing technology, posing a serious challenge for efforts to keep up with and corral the burgeoning fields.

Kaia Hubbard

Kaia Hubbard is a politics reporter for CBS News Digital based in Washington, D.C.
