Today, the Supreme Court heard arguments in Gonzalez v. Google, a case involving Section 230.

The outcome of this case could reshape the internet.

Why?

Section 230 is a federal law that shields tech platforms from liability for their users’ posts.

Gonzalez v. Google is a case in which the family of a man killed in an ISIS attack is suing Google.

The Gonzalez family argues that Google is responsible for promoting ISIS content through its algorithms.

If the court rules in favor of the Gonzalez family, it could set a precedent that would make tech companies liable for the content promoted by their algorithms.

Tech companies would have to invest more in content moderation and develop new algorithms to detect and remove harmful content, potentially limiting free speech and expression.

On the other hand, if the court rules in favor of Google, it could reaffirm Section 230 and ensure that tech companies continue to enjoy broad protection from liability.

Some experts fear the court isn’t well-equipped to rule in this area, as it has historically struggled to grapple with new technology.

Supreme Court Justice Elena Kagan acknowledged today that the justices are not “the nine greatest experts on the Internet.”

A decision is expected this summer. Here’s what we learned from today’s oral arguments.

Gonzalez v. Google: Oral Arguments

Coming out of today’s oral arguments, the Supreme Court justices appear concerned about the unintended consequences of allowing websites to be sued for recommending user content.

Attorneys for the various parties were questioned about how to protect recommendations of innocuous content while still holding platforms accountable for recommending harmful content.

The justices also worry about the impact such a decision could have on individual users of YouTube, Twitter, and other social media platforms.

The concern is that narrowing Section 230 could trigger a wave of lawsuits against websites alleging antitrust violations, discrimination, defamation, and infliction of emotional distress.

In Defense Of Google

Lisa Blatt, a lawyer representing Google in this case, argues that tech companies aren’t liable for what their algorithms promote because they aren’t responsible for the choices and interests of their users.

Algorithms are designed to surface content based on what users have expressed interest in seeing, not to promote harmful or illegal content.

Google and other tech companies don’t create content or control users’ posts. They provide a platform for users to share their thoughts, ideas, and opinions.

Holding tech companies liable for the content promoted by their algorithms would have a chilling effect on free speech and expression.

It would force tech companies to engage in more aggressive content moderation, potentially limiting the free flow of ideas and information online.

This could stifle innovation and creativity, undermining the essence of the internet as an open space for communication and collaboration.

Section 230 of the Communications Decency Act was designed to protect tech companies from this liability.

It recognizes the importance of free expression and the impossibility of policing content posted by millions of users.

Google’s attorney argues that the courts should respect this precedent and not create new rules that could have far-reaching consequences for the future of the internet.

Arguments Against Google

Eric Schnapper, representing the plaintiffs in this case, argues that Google and other tech companies should be held liable because they can influence what users see on their platforms.

Algorithms aren’t neutral or objective. They’re designed to maximize engagement and keep users on the platform, often by promoting sensational or controversial content.

The plaintiffs contend that Google and other tech companies bear responsibility for preventing the spread of harmful content on their platforms.

When they fail to take appropriate action, they can be seen as complicit in spreading the content, which can have serious consequences.

Allowing tech companies to avoid liability for the content promoted by their algorithms could incentivize them to prioritize profit over public safety.

Critics of Section 230 suggest that the Supreme Court should not interpret it in such a way that allows tech companies to evade their responsibility.

Expert Legal Analysis: What’s Going To Happen?

Search Engine Journal contacted Daniel A. Lyons, a professor at Boston College Law School, for his legal opinion on today’s oral arguments.

The first thing Lyons notes is that the petitioners struggled to make a clear and concise argument against Google:

“My sense is that the petitioners did not have a good day at argument. They seemed to be struggling to explain what precisely their argument was–which is unsurprising, as their argument has shifted many times over the course of this litigation. Multiple lines of questions showed the justices struggling with where to draw the line between user speech and the platform’s own speech. The petitioners did not really answer that question, and the Solicitor General’s answer (that Section 230 should not apply anytime the platform makes a recommendation) is problematic in both legal and policy terms.”

Lyons notes that Justice Clarence Thomas, long an advocate for narrowing the scope of Section 230, was surprisingly hostile toward the petitioners’ arguments:

“I was surprised at how hostile Justice Thomas seemed to be toward the Gonzalez arguments. Since 2019, he has been the loudest voice on the court for taking a Section 230 case to narrow the scope of the statute. But he seemed unable to accept the petitioners’ arguments today. On the other hand, Justice Brown Jackson surprised me with how aggressively she went after the statute. She has been silent so far but seemed the most sympathetic to the petitioners today.”

The most likely path forward, Lyons believes, is that the Supreme Court will dismiss the case against Google:

“Justice Barrett suggested what I suspect is the most likely path forward. If Twitter wins the companion case being argued tomorrow, that means that hosting/recommending ISIS content is not a violation of the Anti-Terrorism Act. Because Gonzalez sued on the same claim, this would mean the court could dismiss the Gonzalez case as moot–because whether Google is protected by Section 230 or not, Gonzalez loses either way. I’ve thought for a while this is a likely outcome, and I think it’s more likely given how poorly Gonzalez fared today.”

Then again, it’s still too early to call, Lyons continues:

“That said, it’s unwise to predict a case outcome based on oral argument alone. It’s still possible Google loses, and even a win on the merits poses risks, depending on how narrowly the court writes the opinion. It’s possible that the court’s decision changes the way that platforms recommend content to users–not just social media companies like YouTube and Facebook, but also companies as varied as TripAdvisor, Yelp, or eBay. How much will depend on how the court writes the opinion, and it’s far too early to predict that.”

The full three-hour oral argument can be heard on YouTube.


Featured Image: No-Mad/Shutterstock