The aftermath of the Supreme Court's NetChoice ruling – The Verge

Last week's Supreme Court decision in the NetChoice cases was overshadowed by a ruling on presidential immunity in Trump v. US that came down only minutes later. But whether or not America even noticed NetChoice happen, the decision is poised to affect a host of tech legislation still brewing on Capitol Hill and in state legislatures, as well as lawsuits that are percolating through the system. This includes the pending First Amendment challenge to the TikTok ban bill, as well as a First Amendment case about a Texas age verification law that the Supreme Court took up only a day after its NetChoice decision.

The NetChoice decision states that tech platforms can exercise their First Amendment rights through their content moderation decisions and how they choose to display content on their services, a strong statement that has clear ramifications for any law that attempts to regulate platforms' algorithms in the name of kids' online safety, and even for a pending lawsuit seeking to block a law that could ban TikTok from the US.

"When the platforms use their Standards and Guidelines to decide which third-party content those feeds will display, or how the display will be ordered and organized, they are making expressive choices," Justice Elena Kagan wrote in the majority opinion, referring to Facebook's News Feed and YouTube's homepage. "And because that is true, they receive First Amendment protection."

NetChoice isn't a radical upheaval of existing First Amendment law, but until last week, there was no Supreme Court opinion applying that existing framework to social media platforms. The justices didn't rule on the merits of the cases, concluding instead that the lower courts hadn't completed the necessary analysis for the kind of First Amendment challenge that had been brought. But the decision still provides significant guidance to the lower courts on how to apply First Amendment precedent to social media and content moderation. "The Fifth Circuit was wrong in concluding that Texas's restrictions on the platforms' selection, ordering, and labeling of third-party posts do not interfere with expression," Kagan wrote of the appeals court that upheld a Texas law seeking to prevent platforms from discriminating against content on the basis of viewpoint.

The decision is a revealing look at how the majority of justices view the First Amendment rights of social media companies, something that's at issue in everything from kids' online safety bills to the TikTok ban.

The court is already set to hear Free Speech Coalition v. Paxton next term, a case challenging Texas's HB 1181, which requires internet users to verify their ages (sometimes with government-issued IDs) to access porn sites. Free Speech Coalition, an adult entertainment industry group that counts Pornhub among its members, sued to block the law but lost on appeal. The justices' decision in that case next year has the potential to impact many different state and federal efforts to age-gate the internet.

One recently signed law that may need to contend with the ruling is New York's Stop Addictive Feeds Exploitation (SAFE) for Kids Act, which requires parental consent for social media companies to use "addictive feeds" on minors. The NetChoice ruling calls into question how far legislatures can go in regulating algorithms, that is, software programmed to surface or deprioritize different pieces of information for different users.

A footnote in the majority opinion says the Court "does not deal here with feeds whose algorithms respond solely to how users act online, giving them the content they appear to want, without any regard to independent content standards." The note is almost academic in nature: platforms usually take into account many different variables beyond user behavior, and separating those variables from one another is not a straightforward matter.
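The tangle is easier to see in a sketch. The minimal, hypothetical ranking function below (the field names, weights, and signals are illustrative inventions, not any real platform's system) shows how user-behavior signals and a platform's own content standards typically collapse into a single score:

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_engagement: float  # user-behavior signal: how likely this user is to engage
    quality_score: float         # platform judgment: fit with the service's content standards
    policy_demoted: bool         # platform judgment: flagged under the service's own guidelines

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order a feed by one blended score.

    A feed of the kind the footnote imagines would sort on predicted_engagement
    alone. In practice, editorial inputs (quality_score, policy demotions) are
    folded into the same number, so the two kinds of signal are hard to pull
    apart after the fact.
    """
    def score(p: Post) -> float:
        s = 0.7 * p.predicted_engagement + 0.3 * p.quality_score  # illustrative weights
        if p.policy_demoted:
            s *= 0.1  # an "independent content standard" applied by the platform
        return s
    return sorted(posts, key=score, reverse=True)
```

Once user signals and editorial signals are mixed like this, there is no clean way to point at the output and say which part was "just the algorithm" and which part was the platform's expressive choice.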

"Because it's so hard to disentangle all of the users' preferences, and the guidance from the services, and the editorial decisions of those services, what you're left with, technologically speaking, is algorithms that promote content curation. And it should be inevitably assumed then that those algorithms are protected by the First Amendment," said Jess Miers, who spoke to The Verge before departing her role as senior counsel at the center-left tech industry coalition Chamber of Progress, which receives funding from companies like Google and Meta.

"That's going to squarely hit the New York SAFE Act, which is trying to argue that, look, it's just algorithms, or it's just the design of the service," said Miers. The drafters of the SAFE Act may have presented the law as not having anything to do with content or speech, but NetChoice poses a problem, according to Miers. "The Supreme Court made it pretty clear, curation is absolutely protected."

Miers said the same analysis would apply to other state efforts, like California's Age-Appropriate Design Code, which a district court blocked with a preliminary injunction; the state has appealed. That law required platforms likely to be used by kids to consider their best interests and to default to strong privacy and safety settings. Industry group NetChoice, which also brought the cases at issue in the Supreme Court, argued in its 2022 complaint against California's law that it would interfere with platforms' own editorial judgments.

"To the extent that any of these state laws touch the expressive capabilities of these services, those state laws have an immense uphill battle, and a likely insurmountable First Amendment hurdle as well," Miers said.

Michael Huston, a former clerk to Chief Justice Roberts who co-chairs law firm Perkins Coie's Appeals, Issues & Strategy Practice, said that after this ruling, any ban on content curation would be subject to a level of judicial scrutiny that is difficult to overcome. A law that, for instance, requires platforms to show content only in reverse-chronological order would likely be unconstitutional. (California's Protecting Our Kids from Social Media Addiction Act, which would prohibit the default feeds shown to kids from being based on any information about the user or their devices, or from recommending or prioritizing posts, is one such real-life example.) "The court is clear that there are a lot of questions that are unanswered, that it's not attempting to answer in this area," Huston said. "But broadly speaking ... there's a recognition here that when the platforms make choices about how to organize content, that is itself a part of their own expression."
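For contrast, the kind of feed such a law would mandate involves no curation at all. A minimal sketch (again with invented field names, purely for illustration) is nothing more than a timestamp sort:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    post_id: str
    created_at: datetime

def reverse_chronological_feed(posts: list[Post]) -> list[Post]:
    # No weighting, demotion, or selection: the ordering reflects no judgment
    # by the platform, only when each post was created.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)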

The new Supreme Court decision also raises questions about the future of the Kids Online Safety Act (KOSA), a similar piece of legislation at the federal level that's gained significant steam. KOSA seeks to create a duty of care for tech platforms serving young users and to let those users opt out of algorithmic recommendations. "Now with the NetChoice cases, you have this question as to whether KOSA touches any of the expressive aspects of these services," Miers said. In evaluating KOSA, a court would need to assess: does this regulate a non-expressive part of the service, or does it regulate the way in which the service communicates third-party content to its users?

Supporters of these kinds of bills may point to language in some of the concurring opinions (namely, ones written by Justices Amy Coney Barrett and Samuel Alito) positing scenarios where certain AI-driven decisions do not reflect the preferences of the people who made the services. But Miers said she believes that kind of situation likely doesn't exist.

David Greene, civil liberties director at the Electronic Frontier Foundation, said that the NetChoice decision shows that platforms' curation decisions are First Amendment-protected speech, and it's "very, very difficult if not impossible" for a state to regulate that process.

Similarly important is what the opinion does not say. Gautam Hans, associate clinical professor and associate director of the First Amendment Clinic at Cornell Law School, predicts there will be at least some appetite among states to keep passing laws pertaining to content curation or algorithms by paying close attention to what the justices left out.

"What the Court has not done today is say, states cannot regulate when it comes to content moderation," Hans said. "It has set out some principles as to what might be constitutional versus not. But those principles are not binding."

There are a couple of different approaches the court seems open to, according to experts. Vera Eidelman, staff attorney at the American Civil Liberties Union's Speech, Privacy, and Technology Project, noted that the justices pointed to competition regulation, also known as antitrust law, as a possible way to protect access to information. "These other regulatory approaches could, the Supreme Court seems to be hinting, either satisfy the First Amendment or don't raise First Amendment concerns at all," Eidelman said.

Transparency requirements also appear to be on the table, according to Paul Barrett, deputy director of the New York University Stern Center for Business and Human Rights. He said the decision implies that the standard created under Zauderer v. Office of Disciplinary Counsel, which governs when businesses can be required to disclose certain information, is good law, which could open the door to future transparency legislation. "When it comes to transparency requirements, it's not that the Texas and Florida legislatures necessarily got it right," Barrett said. "Their individualized explanation requirements may have gone too far, even under Zauderer. But disclosure requirements are going to be judged, according to Justice Kagan, under this more deferential standard. So the government will have more leeway to require disclosure. That's really important, because that's a form of oversight that is far less intrusive than telling social media companies how they should moderate content."

The justices' opinion that a higher bar was required to prove a facial challenge to the laws (that is, to show they would be unconstitutional in any scenario) could be reason enough for some legislatures to push ahead. Greene said states could potentially choose to pass laws that would be difficult to challenge unless they are enforced, since bringing a narrower as-applied challenge before enforcement means platforms would have to show they're likely to be targets of the law. But having a law on the books might be enough to get some companies to act as desired, Greene said.

Still, the areas the justices left open to potential regulation might be tricky to get right. For example, the justices seem to maintain the possibility that regulation targeting algorithms that take into account only users' preferences could survive First Amendment challenges. But Miers says that when you read the court opinion "and they start detailing what is considered expression, it becomes increasingly difficult to think of a single internet service that doesn't fall into one of the expressive capabilities or categories the court discusses throughout." What initially seems like a loophole might actually be a null set.

Justice Barrett included what seemed to be a lightly veiled comment about TikTok's challenge to a law seeking to ban it unless it divests from its Chinese parent company. In her concurring opinion, Barrett wrote, without naming names, that a social media platform's foreign ownership and control over its content moderation decisions might affect whether laws overriding those decisions trigger First Amendment scrutiny. That's because foreign persons and corporations located abroad do not have First Amendment rights like US corporations do, she said.

Experts predicted the US government would cite Justice Barrett's opinion in its litigation against TikTok, though they cautioned that the statement of one justice does not necessarily reflect a broader sentiment on the Court. And Barrett's comment still calls for a closer analysis of specific circumstances, like TikTok's, to determine who really controls the company.

Barrett's concurrence notwithstanding, TikTok has also picked up some potentially useful ammunition in NetChoice.

"I'd be feeling pretty good if I were them today," Greene said of TikTok. "The overwhelming message from the NetChoice opinions is that content moderation is speech protected by the First Amendment, and that's the most important holding to TikTok and to all the social media companies."

Still, NetChoice does not resolve the TikTok case, said NYU's Barrett. TikTok's own legal challenge implicates national security, a matter in which courts tend to defer to the government.

"The idea that there are First Amendment rights for the platforms is helpful for TikTok," Hans said. "If I'm TikTok, I'm mostly satisfied, maybe a little concerned, but you rarely get slam dunks."
