Supreme Court hears Gonzalez v. Google case against Big Tech

By Brian Fung and Tierney Sneed, CNN

Updated 5:15 p.m. ET, February 21, 2023
3:48 p.m. ET, February 21, 2023

Takeaways from today's oral arguments on the Gonzalez v. Google case — and how it could reshape the internet 

From CNN's Brian Fung and Tierney Sneed

Television lights are set up outside the U.S. Supreme Court on February 21, 2023 in Washington, DC. (Drew Angerer/Getty Images)

Supreme Court justices appeared broadly concerned Tuesday about the potential unintended consequences of allowing websites to be sued for their automatic recommendations of user content, highlighting the challenges facing attorneys that want to hold Google accountable for suggesting YouTube videos created by terrorist groups. 

For nearly three hours, the nine justices peppered attorneys representing Google, the US government and the family of Nohemi Gonzalez, an American student killed in a 2015 ISIS attack in Paris, with questions about how the court could design a ruling that exposes harmful content recommendations to liability while still protecting innocuous ones.

How – or if – the court draws that line could have significant implications for the way websites choose to rank, display and promote content to their users as they seek to avoid a litigation minefield. 

Beatriz Gonzalez and Jose Hernandez, the mother and stepfather of Nohemi Gonzalez, who died in a terrorist attack in Paris in 2015, speak to the media outside the US Supreme Court today following oral arguments in Gonzalez v. Google in Washington, DC. (Jim Watson/AFP/Getty Images)

The attorney for the Gonzalez family argued that narrowing Section 230 of the Communications Decency Act — the federal law protecting websites’ right to moderate their platforms as they see fit — would not lead to sweeping consequences for the internet. But both the court’s liberals and conservatives worried about the impact of such a decision on everything from “pilaf [recipes] from Uzbekistan” to individual users of YouTube, Twitter and other social media platforms. 

Justices are worried about a wave of lawsuits and disruption to the internet: A big concern of the justices seems to be the wave of lawsuits that could follow if the court rules against Google. 

"Lawsuits will be nonstop," Justice Brett Kavanaugh said at one point. 

But Eric Schnapper, representing the plaintiffs, argued a ruling for Gonzalez would not have far-reaching effects because even if websites could face new liability as a result of the ruling, most suits would likely be thrown out anyway. 

"The implications are limited," Schnapper said, "because the kinds of circumstance in which a recommendation would be actionable are limited." 

Later, Justice Elena Kagan warned that narrowing Section 230 could lead to a wave of lawsuits, even if many of them would eventually be thrown out, in a line of questioning with US Deputy Solicitor General Malcolm Stewart. 

"You are creating a world of lawsuits," Kagan said. "Really, anytime you have content, you also have these presentational and prioritization choices that can be subject to suit."  

Chief Justice John Roberts mused that under a narrowed version of Section 230, terrorism-related cases might only be a small share of a much wider range of future lawsuits against websites alleging antitrust violations, discrimination, defamation and infliction of emotional distress, just to name a few.  

Read more takeaways here and watch CNN's Jessica Schneider break down the case.

2:16 p.m. ET, February 21, 2023

Justice Ketanji Brown Jackson questions the intended scope of Section 230 — here's what the authors said

From CNN's Brian Fung

United States Supreme Court Associate Justice Ketanji Brown Jackson poses for an official portrait at the East Conference Room of the Supreme Court building on October 7, 2022 in Washington, DC. (Alex Wong/Getty Images)

Justice Ketanji Brown Jackson questioned Lisa Blatt about Congress' original intent in passing Section 230, suggesting that the law was never meant to insulate tech platforms from lawsuits linked to algorithmic content recommendations.

"Isn’t it true that the statute had a more narrow scope of immunity than courts have ultimately interpreted it to have, and that it was really just about making sure that your platform and other platforms weren’t disincentivized to block and screen and remove offensive content?" Jackson asked, adding that recommendations were "not something the statute was directed to."

Blatt, who is representing Google in the case, disputed Jackson's narrative, saying lawsuits pertaining to recommendations fell within the scope of the authors' legislative intent to shield websites from a barrage of lawsuits.

"That’s death by a thousand cuts, and the internet would’ve never gotten off the ground if anybody could sue at any time and it were left up to 50 states’ liability regimes," Blatt said.

The original authors of Section 230, Oregon Democratic Sen. Ron Wyden and former California Republican Rep. Chris Cox, explained their thinking behind the legislation in a filing to the court.

"Congress drafted Section 230 in a technology-neutral manner that would enable the provision to apply to subsequently developed methods of presenting and moderating user-generated content," Wyden and Cox wrote. "The targeted recommendations at issue in this case are an example of a more contemporary method of content presentation."

Algorithmic recommendations, they added, are "direct descendants of the early content curation efforts that Congress had in mind when enacting Section 230. And because Section 230 is agnostic as to the underlying technology used by the online platform, a platform is eligible for immunity under Section 230 for its targeted recommendations to the same extent as any other content presentation or moderation activities."

So while Jackson is correct that Section 230 has been interpreted expansively by the courts, the intent of Section 230 is more in line with Blatt's reading than Jackson's, according to the drafters of the original law.

12:22 p.m. ET, February 21, 2023

Lawyer for Google begins: "26 words created today's internet"

From CNN's Tierney Sneed

Lisa Blatt, a well-known Supreme Court advocate who is representing Google in this case, is up for questioning. The hearing has been underway for nearly two hours.

"Section 230(c)(1)'s 26 words created today's internet," Blatt said, kicking off her presentation before the Supreme Court justices.

Here is what the text says:

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

11:58 a.m. ET, February 21, 2023

Supreme Court justices are worried about a wave of lawsuits and disruption to the internet

From CNN's Brian Fung

A view of the U.S. Supreme Court today in Washington, DC. Oral arguments are taking place in Gonzalez v. Google, a landmark case about whether technology companies should be liable for harmful content their algorithms promote. (Drew Angerer/Getty Images)

A big concern of the justices seems to be what happens if the court rules against Google: namely, a wave of lawsuits.

Justice Brett Kavanaugh asked Eric Schnapper, representing the plaintiffs, to respond to various friend-of-the-court briefs warning of widespread disruption.

"We have a lot of amicus briefs that we have to take seriously that say this is going to cause" significant disruption, Kavanaugh said.

Schnapper argued that a ruling for Gonzalez would not have far-reaching effects because even if websites could face new liability as a result of the ruling, most suits would likely be thrown out anyway.

"The implications are limited," Schnapper argued, "because the kinds of circumstance in which a recommendation would be actionable are limited."

Many of the briefs Kavanaugh referenced had expressed fears of a deluge of litigation that could overwhelm startups and small businesses, regardless of whether the suits held any merit.

Later, Justice Elena Kagan warned that narrowing Section 230 could lead to a wave of lawsuits, even if many of them would eventually be thrown out, in a line of questioning with US Deputy Solicitor General Malcolm Stewart.

"You are creating a world of lawsuits," Kagan said. "Really, anytime you have content, you also have these presentational and prioritization choices that can be subject to suit." 

Even as Stewart suggested many such lawsuits might not ultimately lead to anything, Justices Kavanaugh and Roberts appeared to take issue with the potential rise in lawsuits in the first place.

"Lawsuits will be nonstop," Kavanaugh said.

Chief Justice John Roberts mused that under a narrowed version of Section 230, terrorism-related cases might only be a small share of a much wider range of future lawsuits against websites alleging antitrust violations, discrimination, defamation and infliction of emotional distress, just to name a few.

"I wouldn't necessarily agree with 'there would be lots of lawsuits' simply because there are a lot of things to sue about, but they would not be suits that have much likelihood of prevailing, especially if the court makes clear that even after there's a recommendation, the website still can't be treated as the publisher or speaker of the underlying third party," Stewart said.

12:49 p.m. ET, February 21, 2023

Section 230 has been mentioned a lot today. Here are key things to know about the law 

From CNN's Brian Fung

(Adobe Stock)

Congress, the White House and now the US Supreme Court are all focusing their attention on a federal law that’s long served as a legal shield for online platforms.

The Supreme Court is hearing oral arguments on two pivotal cases this week dealing with online speech and content moderation. Central to the arguments is Section 230, a federal law that’s been roundly criticized by both Republicans and Democrats for different reasons but that tech companies and digital rights groups have defended as vital to a functioning internet. In today's oral arguments on Gonzalez v. Google, the law has already come up multiple times.

Tech companies involved in the litigation have cited the 27-year-old statute as part of an argument for why they shouldn’t have to face lawsuits alleging they gave knowing, substantial assistance to terrorist acts by hosting or algorithmically recommending terrorist content.

Here are key things to know about the law:

  • Passed in 1996 during the early days of the World Wide Web, Section 230 of the Communications Decency Act was meant to nurture startups and entrepreneurs. The legislation’s text recognized the internet was in its infancy and risked being choked out of existence if website owners could be sued for things other people posted.
  • Under Section 230, websites enjoy immunity for moderating content in the ways they see fit — not according to others’ preferences — although the federal government can still sue platforms for violating criminal or intellectual property laws.
  • Contrary to what some politicians have claimed, Section 230’s protections do not hinge on a platform being politically or ideologically neutral. The law also does not require that a website be classified as a publisher in order to “qualify” for liability protection. Apart from meeting the definition of an “interactive computer service,” websites need not do anything to gain Section 230’s benefits – they apply automatically.
  • The law’s central provision holds that websites (and their users) cannot be treated legally as the publishers or speakers of other people’s content. In plain English, that means any legal responsibility attached to publishing a given piece of content ends with the person or entity that created it, not the platforms on which the content is shared or the users who re-share it.
  • The seemingly simple language of Section 230 belies its sweeping impact. Courts have repeatedly accepted Section 230 as a defense against claims of defamation, negligence and other allegations. In the past, it’s protected AOL, Craigslist, Google and Yahoo, building up a body of law so broad and influential as to be considered a pillar of today’s internet. In recent years, however, critics of Section 230 have increasingly questioned the law’s scope and proposed restrictions on the circumstances in which websites may invoke the legal shield.

Read more about Section 230 here.

11:38 a.m. ET, February 21, 2023

Plaintiffs' attorney says that internet users could be held liable for retweeting or sharing

From CNN's Brian Fung

(Adobe Stock)

Under questioning by Justice Amy Coney Barrett, attorney Eric Schnapper, representing the plaintiffs in Gonzalez v. Google, confirmed that under the legal theory he is advancing, Section 230 would not protect individual internet users retweeting, sharing or liking other people's content.

The text of Section 230 explicitly immunizes "users" from liability for the content posted by third parties, not just social media platforms.

Barrett asked Schnapper whether giving a "thumbs-up" to another user's post, or taking actions such as "like, retweet, or say 'check this out,'" means that she has "created new content" and thus lost Section 230's protections.

After quibbling briefly with Barrett over the definition of a user, Schnapper acknowledged that the act of liking or retweeting is an act of content creation that should expose the person liking or retweeting to potential liability.

"On your theory, I’m not protected by 230?" Barrett asked.

"That's content you've created," Schnapper replied.

11:22 a.m. ET, February 21, 2023

DOJ is now up for questioning

From CNN's Tierney Sneed

Malcolm Stewart, the US deputy solicitor general, is now up for arguments.

The Justice Department is arguing that Section 230's protections for platforms should be read more broadly than the plaintiffs contend, but that the algorithms platforms use to make recommendations could potentially open tech companies up to liability.

11:16 a.m. ET, February 21, 2023

Amy Coney Barrett points to an exit ramp the court could take to dodge the big question about Section 230

From CNN's Tierney Sneed

In this October 2020 photo, Supreme Court nominee Judge Amy Coney Barrett meets with Sen. James Lankford in Washington, DC.
In this October 2020 photo, Supreme Court nominee Judge Amy Coney Barrett meets with Sen. James Lankford in Washington, DC. (Sarah Silbiger/Pool/Getty Images)

Justice Amy Coney Barrett referenced an exit ramp that would allow the Supreme Court to avoid the big legal question over the scope of Section 230, the law that gives internet platforms certain protections from legal liability.

She pointed to the tech case the court will hear Wednesday, in which the justices will consider whether an anti-terrorism law covers internet platforms' failure to adequately remove terrorism-related content. The same law is being used by the plaintiffs to sue Google in Tuesday's case.

"So if you lose tomorrow, do we even have to reach the Section 230 question here? Would you concede that you would lose on that ground here?" Justice Barrett asked Eric Schnapper, the attorney for those who have sued Google.

2:26 p.m. ET, February 21, 2023

Justice Elena Kagan: Supreme Court justices are not the "nine greatest experts on the internet"

From CNN's Tierney Sneed and Brian Fung

United States Supreme Court Associate Justice Sonia Sotomayor, Associate Justice Clarence Thomas, Chief Justice of the United States John Roberts, Associate Justice Samuel Alito, Associate Justice Elena Kagan, Associate Justice Amy Coney Barrett, Associate Justice Neil Gorsuch, Associate Justice Brett Kavanaugh and Associate Justice Ketanji Brown Jackson pose for their official portrait at the East Conference Room of the Supreme Court building on October 7, 2022 in Washington, DC. (Alex Wong/Getty Images)

Justice Elena Kagan hinted that she thought that even if the arguments against Google had merit as a policy matter, Congress — rather than the Supreme Court — should be the one to step in.

"I could imagine a world where you’re right, that none of this stuff gets protection. And you know — every other industry has to internalize the costs of its conduct. Why is it that the tech industry gets a pass? A little bit unclear," Kagan said.

"On the other hand — we're a court. We really don't know about these sorts of things. These are not, like, the nine greatest experts on the internet," she said.

Following laughter in the courtroom, Kagan said she didn't have to accept "the sky is falling" arguments from Google's lawyers to think these were difficult and uncertain waters for the judicial branch to be wading into.

Maybe Congress wants a system in which Google isn't so broadly protected, she offered: "But isn't that something for Congress to do? Not the court?"