The Supreme Court is about to hear a crucial Section 230 case
Washington has been at a loss for years over how to regulate the internet, or whether it should even try. But the Supreme Court will hear a case next week that could completely change the online world as we know it.
On Tuesday, the justices will hear arguments in Gonzalez v. Google, a case challenging Section 230 of the Communications Decency Act, a 1996 law that shields internet platforms from liability for most third-party content posted on their sites. The arguments will center on the recommendation algorithms that the plaintiffs say boosted extremist messaging in the run-up to a terrorist attack. They argue that Section 230’s protections should not extend to content that a company’s algorithms recommend, and that Google is therefore legally liable for the extremist videos posted on its YouTube service.
While the hearing is scheduled for next week, a ruling is not expected until June.
Section 230 is why companies like Facebook and Twitter are not liable for content their users create, and why a website is not legally at fault if someone posts a defamatory review on it. The law has come under fire in recent years from critics who say it facilitates misinformation and shields websites known for spreading hateful and extremist rhetoric. But experts also fear that rollbacks of Section 230 could go too far and irretrievably destroy the foundations of free speech on which the internet was built.
Recent AI developments like ChatGPT have added a new dimension to the Section 230 battle, as the bots, which have so far proven unreliable at providing accurate information, could soon be protected by the law as well.
Some experts say the Supreme Court’s decisions in these cases present a unique opportunity to set clearer rules for Section 230, but others warn that going too far could completely undermine the law and render our relationship with the internet almost unrecognizable.
“As the digital world becomes more intertwined with our physical world, this becomes more urgent,” Lauren Krapf, lead counsel for technology policy and advocacy at the Anti-Defamation League, an anti-discrimination group, told Fortune.
The backbone of the modern web
Section 230 made the internet work the way it does today by allowing websites to host most content without fear of legal liability. Its influence rests on a 26-word provision that shaped the creation of the modern internet: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
The Electronic Frontier Foundation, a digital rights organization, says that without Section 230, “the free and open internet as we know it could not exist,” and the provision shielding internet companies is often referred to as “the 26 words that created the internet.”
But those words, written more than a quarter century ago, have come under scrutiny in recent years, and politicians on both sides of the aisle have targeted Section 230 as part of larger efforts to regulate the internet. Even tech leaders, including Meta CEO Mark Zuckerberg, have suggested that Congress should require platforms to prove they have systems in place to identify illegal content. But opinion on how, and to what extent, the law should be revised remains divided.
“We’re at a point where Congress really needs to update Section 230,” Krapf said. Her organization has filed an amicus brief in the Google case in support of the plaintiffs, asking the Supreme Court to consider the implications of Section 230’s immunity provision.
However, given the far-reaching implications of Section 230, reaching agreement on how best to revise it is not an easy task.
“Because [Section 230] is a high stakes piece of the puzzle, I think there are a lot of different viewpoints on how it should be updated or reformed and what we should do about it,” Krapf said.
The cases
What sets Gonzalez v. Google apart from previous attempts to refine Section 230 is that the matter is being brought before the Supreme Court for the first time, rather than before Congress, and the ruling could set a precedent for future interpretations of the law.
The plaintiffs’ argument centers on the spread of pro-terror messaging on online platforms. The Gonzalez family alleges that Google-owned YouTube contributed to the radicalization of ISIS fighters in the run-up to a 2015 terrorist attack in Paris that killed 130 people, including 23-year-old Nohemi Gonzalez, an American student studying abroad. A lower court ruled in favor of Google, citing Section 230’s protections, and the Gonzalez family turned to the Supreme Court, arguing that Section 230 covers content itself but not the algorithmic recommendations at issue.
The Google case isn’t the only one posing a potential challenge to Section 230 next week. A related case that the court will hear on Wednesday, Twitter v. Taamneh, was brought by the relatives of Jordanian national Nawras Alassaf, one of 39 people killed in an ISIS-linked mass shooting at an Istanbul nightclub in 2017.
Alassaf’s family sued Twitter, Google, and Facebook for failing to control pro-terrorist content on their platforms, and a lower court allowed the lawsuit to proceed. Twitter argued that letting the suit go forward was an unconstitutional extension of the anti-terrorism law and appealed the decision to the high court. The lower court never reached the Section 230 question in the case, but it will likely come up at next week’s Supreme Court hearing.
Targeting recommendations might be a slippery slope
The Gonzalez family is asking the Supreme Court to clarify whether YouTube’s recommendations fall outside Section 230’s protections, and carving out exceptions to the law is not unprecedented.
In 2018, then-President Donald Trump signed into law an exception that made online sites liable for sex trafficking-related content. The difference in the Google case, however, is that the plaintiffs are not targeting specific content, but rather the recommendations generated by the company’s algorithms.
“Their allegation is that their lawsuit targets YouTube’s recommendations, not the content itself, because if they target the content itself, clearly Section 230 comes into play and the lawsuit gets thrown out of court,” Paul Barrett, deputy director and senior research scholar at NYU’s Stern Center for Business and Human Rights, told Fortune.
Almost every online platform, including Google, Twitter, and Facebook, uses algorithms to generate curated content recommendations for its users. Barrett argued that because recommendation algorithms now sit at the heart of everything tech companies do, targeting recommendations instead of content could become a slippery slope, opening the door to future lawsuits against online platforms.
Barrett and the center he is part of have also filed an amicus brief with the court, acknowledging the need to modernize Section 230 but arguing that the law remains a critical pillar of free speech online, and that an extreme ruling opening the door to lawsuits targeting algorithms instead of content could compromise that protection.
“A recommendation is not a separate, distinct, and unusual activity divorced from the recommended videos. In fact, recommendation is what social media platforms generally do,” he said.
If the Supreme Court rules in favor of the Gonzalez family, it could leave Section 230 vulnerable to future lawsuits targeting online platforms’ algorithms rather than their content, Barrett said, adding that in the extreme case it could lead to a complete erosion of the protections the law grants technology companies.
“I think what you would see would be a very dramatic curtailment or reduction of what’s available on most platforms because they just don’t want to take the risk,” he said. Instead, he said, online platforms would censor themselves to carry significantly less “lawsuit-baiting” content.
Such an extreme erosion of Section 230 would make life much harder for large companies, but it could pose an existential threat to smaller online platforms that rely primarily on crowdsourced content and have fewer resources to draw on, including popular sites like Wikipedia, Barrett said.
“We wanted to sound the alarm, ‘Hey, if you’re going down this path, you might be doing more than you think,'” Barrett said.
Both Barrett and Krapf agreed that refining Section 230 is probably long overdue and will only become more urgent as technology grows more intertwined with our lives. Krapf described the court hearings as a good opportunity to bring clarity to Section 230, arguing that Congress must regulate the behavior of tech companies and ensure consumers are protected in the digital world.
“I think the urgency just keeps building,” Krapf said. “We’ve seen our reliance on our digital world really come into its own in recent years. And now that a new wave of technological advances is coming to the fore, we need better rules of the road.”