The U.S. Supreme Court handed internet and social media companies two victories, upholding their legal protections and declining to open the door for victims of terrorist attacks to sue the companies under anti-terrorism legislation.
The Supreme Court sidestepped a request to narrow Section 230 of the Communications Decency Act, which shields internet companies from lawsuits over content posted by their users. That dispute involved Google LLC’s YouTube video-sharing platform. In a related decision, the court shielded Twitter Inc. from claims brought under the Anti-Terrorism Act, a federal law.
In both cases, families of people killed by Islamist gunmen overseas had sued to try to hold internet companies liable because of the presence of militant groups on their platforms or for recommending their content.
The justices, in a 9-0 decision, reversed a lower court’s ruling that had revived a lawsuit against Twitter by the American relatives of Nawras Alassaf, a Jordanian man killed in a 2017 attack on an Istanbul nightclub during New Year’s celebrations that was claimed by the Islamic State militant group.
In the case involving YouTube, which along with Google is part of Alphabet Inc, the justices returned to a lower court a lawsuit by the family of Nohemi Gonzalez, a college student from California who was fatally shot in an Islamic State attack in Paris in 2015. The justices declined to address the scope of Section 230, concluding that they did not need to take that step because the family’s claims appeared likely to fail given the ruling in the Twitter case.
Section 230 provides safeguards for “interactive computer services” by ensuring they cannot be treated for legal purposes as the “publisher or speaker” of information provided by users.
Calls have come from across the ideological and political spectrum – including Democratic President Joe Biden and his Republican predecessor Donald Trump – for a rethink of Section 230 to ensure that companies can be held accountable for content on their platforms. The YouTube case marked the first time the Supreme Court had examined Section 230’s reach.
“Countless companies, scholars, content creators and civil society organizations who joined with us in this case will be reassured by this result,” said Google General Counsel Halimah DeLaine Prado. “We’ll continue our work to safeguard free expression online, combat harmful content and support businesses and creators who benefit from the internet.”
Critics have said Section 230 too often prevents platforms from being held accountable for real-world harms. Many liberals have condemned misinformation and hate speech on social media. Many conservatives have said voices on the right are censored by social media companies under the guise of content moderation.
The Istanbul massacre on Jan. 1, 2017, killed Alassaf and 38 others. His relatives accused Twitter of aiding and abetting the Islamic State, which claimed responsibility for the attack, by failing to police the platform for the group’s accounts or posts in violation of the Anti-Terrorism Act, which enables Americans to recover damages related to “an act of international terrorism.”
Gonzalez’s family argued that YouTube provided unlawful assistance to the Islamic State by recommending the militant group’s content to users. In their brief ruling on Thursday, the justices wrote that they “decline to address the application of (Section 230) to a complaint that appears to state little, if any, plausible claim for relief.”
Twitter and its backers had said that allowing lawsuits like the one brought by Alassaf’s family would threaten internet companies with liability for providing widely available services to billions of users because some of them may be members of militant groups, even as the platforms regularly enforce policies against terrorism-related content.
The case hinged on whether the family’s claims sufficiently alleged that the company knowingly provided “substantial assistance” to an “act of international terrorism” that would allow the relatives to maintain their suit and seek damages under the anti-terrorism law.
After a judge dismissed the lawsuit, the San Francisco-based 9th U.S. Circuit Court of Appeals in 2021 allowed it to proceed, concluding that Twitter had refused to take “meaningful steps” to prevent Islamic State’s use of the platform.
Conservative Justice Clarence Thomas, who authored the ruling, said the allegations made by the plaintiffs were insufficient because they “point to no act of encouraging, soliciting or advising the commission” of the attack.
“Rather, they essentially portray defendants as bystanders, watching passively as ISIS carried out its nefarious schemes,” Thomas added.
Biden’s administration supported Twitter, saying the Anti-Terrorism Act imposes liability for assisting a terrorist act and not for “providing generalized aid to a foreign terrorist organization” with no causal link to the act at issue.
In the Twitter case, the 9th Circuit did not consider whether Section 230 barred the family’s lawsuit. Google and Meta’s Facebook, also defendants, did not formally join Twitter’s appeal.