The Supreme Court of the United States today opened a judicial term loaded with important cases touching on affirmative action in university admissions, the environment, electoral rules and discrimination against gay people, among many other issues. As oral arguments resumed in person for the first time since the pandemic broke out, the Supreme Court announced that it was admitting new cases. Among them, two stand out that will measure how much responsibility large technology companies bear for the content their users publish on social networks.
Google, Facebook, Twitter, Amazon and other companies that run social networks thus face a major judicial test over the moderation of their content, an issue that has sparked controversy and is being regulated differently across the states. One of the admitted cases, Gonzalez, Reynaldo, et al. v. Google, will examine to what extent Google can be held responsible for the Bataclan massacre in Paris for having allowed the dissemination of videos inciting Islamist violence on its YouTube platform. The other affects Twitter, Google and Facebook in connection with the 2017 attack on an Istanbul nightclub that left 39 dead.
Those suing Google over YouTube are the relatives of Nohemi Gonzalez, a 23-year-old American university student who was among the people killed by Islamic State terrorists in the series of attacks that shocked Paris on November 13, 2015, at the Bataclan concert hall and other sites in the French capital. Gonzalez was killed in a restaurant where she was dining that day. The lower courts rejected the claim, but the family appealed to the Supreme Court, which has now agreed to hear the case.
US law holds that internet companies are not responsible for the content posted by their users, but the issue has become controversial for several reasons. A number of perpetrators of mass shootings have broadcast their attacks live. At the same time, content on social networks has become an object of political contention: while Democrats denounce the far-right and conspiracy propaganda spread on the platforms, Republicans complain about the content moderation policies practiced by some Big Tech companies, which they consider censorship.
Nohemi Gonzalez’s family argues that YouTube does not limit itself to a passive role, merely letting users search for what to watch; rather, its algorithm recommends videos based on each user’s history. As a result, those who watched Islamist propaganda videos were served more content of the same kind, facilitating their radicalization. The family complains that the Google subsidiary, whose parent company is now Alphabet, allowed the dissemination of radical propaganda videos that incited violence.
“Whether Section 230 [the provision that, in principle, shields companies from liability for their users’ content] applies to these algorithmically generated recommendations is of enormous practical importance,” the family argued in its petition. “Interactive computer services constantly direct such recommendations, in one form or another, at virtually every adult and child in the United States who uses social media.” The victim’s family believes that Google violated anti-terrorism law by allowing the dissemination of these videos.
Google counters that the only link between the Paris attacks and YouTube was that one of the attackers was an active user of the platform and once appeared in an ISIS propaganda video. “This court should not lightly adopt a reading of Section 230 that would threaten the basic organizational decisions of the modern internet,” Google argues.
In the other case, lower courts held that Twitter, Facebook and Google should bear some responsibility for the content spread in connection with the massacre at an Istanbul nightclub, the Reina club, during a New Year’s Eve party at the end of 2016. That case, Twitter, et al. v. Taamneh, Mehier, et al., has also been admitted by the Supreme Court.
Both cases will be an initial test of strength in a broader battle over what immunity companies should or should not have for their users’ content and, at the same time, over how much latitude they have in their moderation policies.
The court rules on the cases it admits each year within a period that runs until late June or early July, when the term ends.
Source: EL PAIS