How Israel's fight against online terrorism led to a SCOTUS Section 230 case

LEGAL AFFAIRS: How Israel’s fight against social media’s role in radicalization led to a SCOTUS case that could change the Internet.

BEATRIZ GONZALEZ and Jose Hernandez, mother and stepfather of Nohemi Gonzalez, who was killed in an ISIS attack, in front of the Supreme Court in Washington with Shurat Hadin head Nitsana Darshan-Leitner. (photo credit: SHURAT HADIN)

The impact of online terrorist propaganda is still fresh for Beatriz Gonzalez. Her daughter, Nohemi, was murdered in the 2015 Islamic State attacks in Paris.

“This has been an injustice,” said Gonzalez, “not only for my daughter but for everybody suffering for their lost loved ones – so many families.”

On Tuesday, Gonzalez’s call for justice was heard before the US Supreme Court, in a case alleging that Google is liable for the algorithmic amplification of terrorist content on YouTube.

Gonzalez v. Google and its Wednesday companion case, Twitter v. Taamneh, have not only questioned the responsibility and role that social media platforms play in the recruitment and radicalization of terrorists, but have the potential to reshape the Internet. The cases challenge Section 230, a controversial provision of US law that gives websites broad immunity from liability for third-party content.

A person touches a ''Delete app'' icon near the Twitter logo in this illustration taken December 19, 2022. (credit: REUTERS/DADO RUVIC/ILLUSTRATION)

As Gonzalez noted, many families have been seeking justice and trying to end terrorists’ impunity on social media. This includes families in Israel. The cases that could change the Internet have their roots in Israel’s fight against online terrorism. What began in Israel has now arrived before one of the preeminent courts in the world.

The “knife intifada”: How terrorists use social media

In 2015, the so-called knife intifada erupted in Israel. As the name of the yearlong wave of violence suggests, the period was characterized by frequent stabbing attacks by Palestinian terrorists against Israeli civilians. However, as noted by Nitsana Darshan-Leitner, head of the NGO Shurat Hadin, which takes legal action on behalf of terrorism victims, the period was also known as the “Facebook intifada.”

The attackers were incited to violence on social media, explained Darshan-Leitner. “There were posts, thousands of posts, calling to kill, illustrating how to kill in videos, examples of how to twist the knife.”

The terrorism wave was marked by a high number of teenage attackers, some as young as 13. Martyrs were extolled with graphics and associated hashtags.

“Terrorists received tens of thousands of likes, encouraging others to follow them,” said Darshan-Leitner.

Retired US generals and former Israeli counterterrorism officials filed briefs in the Taamneh and Gonzalez cases, respectively, detailing the role of social media beyond radicalization. Terrorists recruit directly through social media and raise funds through calls for donations on the platforms. They amplify the demoralizing effect of violence through livestreams, and share best practices, instructions and guides, as well as directives to operatives.

During the so-called Facebook intifada, Darshan-Leitner said that Prime Minister Benjamin Netanyahu’s government had urged “Facebook to take the incitement down, but Facebook refused. Facebook said that they were only a neutral bulletin board.

“So we decided to sue Facebook,” said Darshan-Leitner.

Force v. Facebook

Taylor Force was a business student and an American veteran of Iraq and Afghanistan. In 2016, during the “knife intifada,” he was stabbed and killed in Tel Aviv by a Hamas terrorist.

‘I HAVEN’T gone on Facebook in days. I’m feeling more peaceful than I ever have,’ says the writer. (credit: DADO RUVIC/REUTERS ILLUSTRATION)

Like Gonzalez, his relatives sought justice. Shurat Hadin, which saw social media as a vital tool for terrorists, brought legal action in federal court in New York on behalf of five families.

“A bank, for instance, cannot open a bank account to Hamas or Islamic Jihad,” said Darshan-Leitner. “So why should Facebook be allowed to open a page for ISIS or Hezbollah?”

Shurat Hadin argued that Facebook gave material assistance to a known terrorist organization, Hamas, by providing it with a communications platform and recommending Hamas content to users.

The court didn’t side with the terrorism victims and dismissed the case; the court of appeals upheld the dismissal, and the Supreme Court declined to take it up. Facebook stood behind an impenetrable wall that made it untouchable.

Robert Tolchin, who was counsel for the petitioners and has worked with Shurat Hadin on anti-terrorism litigation for almost 20 years, said that “the problem that we’ve come up against in all these cases was Section 230.”

Section 230

Section 230, part of Title 47 of the US Code and enacted in the 1996 Communications Decency Act, was introduced to promote the development of the Internet in its infancy by protecting Web services from being drowned in lawsuits, and to address concerns about children’s access to inappropriate content.

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” the provision reads.

Until now, this has been interpreted to mean that websites and social media platforms aren’t liable for third-party content produced and shared by users on those platforms. If a user makes a post defaming another person, it is the user, rather than the platform, that is responsible.

Section 230 also permits “good faith” action taken to restrict egregious content by users – it allows selective removal – without the platforms losing their immunity from legal responsibility for content.

The provision has been the bane not just of Shurat Hadin’s counterterrorism efforts, but of many in the US who feel that social media platforms have abused their power and are acting in anything but “good faith.”

According to the amicus brief on the Gonzalez case filed by Sen. Ted Cruz, Rep. Mike Johnson and 15 other congresspeople, broad lower court interpretations of Section 230 in cases such as Zeran v. America Online “have conferred near-absolute immunity on Big Tech companies to alter and push harmful content, while simultaneously censoring conservative viewpoints on important political and social matters.”

Sen. Josh Hawley argued in his own brief in support of Gonzalez that Congress had never intended to eliminate distributor liability – under which a distributor is liable once it is aware of unlawful content and fails to take action – but only publisher liability, which applies to publishers with editorial control over and knowledge of everything in their publication. Court interpretations had blurred the distinction.

States have also been upset by what they see as the federal displacement of state libel laws. The District of Columbia and 26 US states have filed a brief in support of Gonzalez expressing as much.

Section 230 was not the well-thought-out result of a long hearing, and was introduced when Google, Facebook and Twitter didn’t exist, said Tolchin, who is on the legal team for Gonzalez and Taamneh.

A 3D-printed logo of Meta, Facebook's rebranded parent company, is seen in front of a displayed Google logo in this illustration taken on November 2, 2021. (credit: REUTERS/DADO RUVIC/ILLUSTRATION/FILE PHOTO)

Shurat Hadin tried again to overcome the impregnable fortress that Section 230 had become, this time advocating not on behalf of Israeli citizens but of a Mexican-American family, “because terrorism is not an Israeli problem; it’s a global problem,” said Darshan-Leitner.

Seeking justice

Nohemi Gonzalez was the youngest child in a family of Mexican-American immigrants.

“When she was a little girl, she was very independent. She knew what she wanted to do with her life; she wanted to get an education, and she put all her mind into her goal,” said Beatriz Gonzalez.

While still in high school, Nohemi took classes at a community college, and after graduating she was accepted into California State University, Long Beach. Pursuing her aspirations as an industrial designer, she later went to Paris for a study-abroad program.

On November 13, 2015, coordinated terrorist attacks struck Paris, killing Nohemi and 129 other people.

Every year since, the school Nohemi attended in Paris has held a memorial for her.

“It’s been seven years since she left. But the memories that I have of her are fresh; she’s always going to be with us,” said Beatriz.

Darshan-Leitner said that “when we were put in touch with the Gonzalez family, we agreed to file a case on their behalf, and we decided to sue Google,” the owner of YouTube.

Beatriz told The Jerusalem Post that “many people are suffering, and nobody is stopping these [terrorist] groups from getting together and doing whatever they want on social media.”

The relatives of Nawras Alassaf also suffered as a result of ISIS terrorism. Alassaf was killed in the 2017 Reina nightclub shooting in Istanbul, and a companion case was brought on behalf of his relatives. Hundreds of terrorism victims filed briefs in support of the Taamneh family.

While these cases, too, were rejected by lower courts, this time the Supreme Court agreed to hear the appeals.

Algorithms and terrorism

The argument for YouTube’s liability for the rise of ISIS and the subsequent death of Nohemi rests on the platform’s recommendation systems, which algorithmically suggest content similar to what users have liked or regularly watched. In its brief, the Counter Extremism Project detailed how these algorithms are built on the premise that “edgy” content is more attention-grabbing, leading to inundation and the radicalization of users. Petitioners contend that Google monetized this process through its ad programs while failing to take the necessary action to remove the wave of jihadist content it was suggesting.
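To make that mechanism concrete, the sketch below is a deliberately simplified, hypothetical illustration of an engagement-weighted, similarity-based recommender (not YouTube’s actual system, whose details are not in the public record); the tags, engagement scores and function names are invented for the example.

```python
# Toy sketch of a similarity-based recommender: it is NOT YouTube's
# algorithm, only an illustration of how ranking by similarity to a
# user's watch history, weighted by "engagement," keeps surfacing
# more of whatever the user already watches.
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse tag-count vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def recommend(watch_history, catalog, top_n=3):
    """Rank catalog items by similarity to the user's aggregate watch
    history, multiplied by each item's engagement score."""
    profile = Counter()
    for video in watch_history:
        profile.update(video["tags"])
    scored = [
        (cosine(profile, Counter(video["tags"])) * video["engagement"], video["title"])
        for video in catalog
    ]
    return [title for _, title in sorted(scored, reverse=True)[:top_n]]

# Hypothetical data, invented purely for illustration.
history = [{"tags": ["news", "conflict"]}, {"tags": ["conflict", "speech"]}]
catalog = [
    {"title": "cooking show", "tags": ["food"], "engagement": 0.9},
    {"title": "extreme rhetoric", "tags": ["conflict", "speech"], "engagement": 1.5},
    {"title": "calm analysis", "tags": ["news"], "engagement": 0.7},
]
# The most-similar, highest-engagement item ranks first, so the feed
# keeps leaning toward more of the same.
print(recommend(history, catalog))
```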

Recommending content should make YouTube more than a publisher of third-party content, argued Tolchin. “This is more than a billboard; you are guiding people down the rabbit hole with your algorithms. This isn’t someone going to the library and selecting a book; this is a librarian following [you] around and suggesting books.”

In Taamneh, the complaints against Twitter, Facebook and YouTube hold that the companies knew Islamic State terrorists were using their platforms and didn’t take action, and that allowing the use of their platforms for communications and recruitment constitutes material support for terrorists.

Silhouettes of laptop and mobile device users are seen next to a screen projection of the YouTube logo in this picture illustration taken March 28, 2018. (credit: REUTERS/DADO RUVIC/ILLUSTRATION/FILE PHOTO)

“Congress made clear that if you help a terrorist organization and that group commits a terrorist act, you can be held responsible,” said Keith Altman, a member of the Taamneh legal team. “The Taamneh case seeks to hold social media companies liable for their knowing assistance to ISIS.”

THE SOCIAL media giants have argued that terrorist content is already forbidden by their terms of service. While they lack the capacity to review all content, they employ automatic flagging and personnel to remove as much as possible, and say their routine services were simply being abused. Their platforms, they argue, have not been linked directly to the attacks, and they did not knowingly aid or encourage them.

In Gonzalez, Google argued that YouTube’s videos couldn’t be proven to have radicalized the attackers; instead, the complaint amounted to a general claim about its role in ISIS’s rise to prominence.

In Taamneh, the companies argued that they could not be seen as abetting a criminal act that they had no knowledge of and did not actively assist.

As for the algorithms, Google argued that they were neutral automated tools and that it remained protected from liability under Section 230, noting that every lower court had reiterated these protections.

Outcome of breaking the Internet

Prominent Internet companies and interest groups have filed briefs in support of Google, warning of the impact that would arise if Section 230 were to be abolished or restricted.

Social media platforms like Reddit said that Section 230 allows users and moderators to promote, recommend and remove third-party content. Wikimedia argued that its articles are user-generated content made possible only by Section 230. Yelp expressed concern about its own recommendation features and its user-generated reviews of businesses, which are sometimes negative. A brief filed by journalists’ groups warned that without Section 230, speech on the Internet would be curtailed, thereby limiting sources, and that breaking news, which is sometimes incorrect, would be disseminated less, as some would hesitate to share it. The Internet, they argued, would fundamentally change if websites became more liable for the conduct of their users.

Tolchin offered words of caution and calm amid the panic about Section 230, saying that “every time you have one of these cases, you have people saying the sky is falling if the court rules one way or another.

“The next thing that will happen, Congress will develop a new law, but the courts shouldn’t be inventing immunity.”

The real outcome of a successful case would be that “some of the ugly things on the Internet might be limited, and that’s a good thing,” he continued.

If Section 230’s immunity were rolled back, new fronts would open in the fight against online terrorist propaganda, said Darshan-Leitner.

“We have all our cases that were dismissed waiting for this new ruling. Because once the Court will rule that they have no immunity, we can refile the lawsuit against the social media companies. We also have lawsuits which are pending upon the Court’s decision,” Darshan-Leitner said. “Unfortunately, terrorism has no end.”•