The Hazards of Holding YouTube Liable for Promoting Terrorism

Every day, people around the world post about 720,000 hours of new content on YouTube—500 hours of video every minute. That enormous volume of material poses challenges for the platform, which aspires to enforce rules against certain kinds of content, and for its users, who cannot hope to navigate the site without help from YouTube’s algorithms, which facilitate searches and recommend videos based on personal viewing patterns.

Those challenges underlie a case that the Supreme Court will hear next month when it will consider whether Google, which owns YouTube, can be sued for helping the terrorist group ISIS promote its message and attract followers. The case illustrates the hazards of increased civil liability for social media companies, which critics on the right and the left wrongly see as the key to better moderation practices.

Since 1996, federal law has shielded websites from most kinds of civil liability for content posted by users. Under 47 U.S.C. § 230, "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."

Section 230 also protects “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” These two kinds of immunity aim to avoid potentially crippling litigation that would impede the availability of user-generated information and deter content moderation, making the internet as we know it impossible.

In 2021, the U.S. Court of Appeals for the 9th Circuit ruled that Section 230 barred a lawsuit against Google by the family of Nohemi Gonzalez, a 23-year-old U.S. citizen who was killed in a 2015 ISIS attack while studying in Paris. The plaintiffs originally argued that Google was liable under the Anti-Terrorism Act for allowing ISIS videos to remain on YouTube and for increasing exposure to them through its “up next” feature, which suggests videos similar to ones users have watched.

On appeal to the Supreme Court, Gonzalez’s family concedes that Section 230 means Google, which bans YouTube videos “intended to praise, promote, or aid violent extremist or criminal organizations,” cannot be sued for failing to fully enforce that policy. But the plaintiffs argue that the company can be sued for pointing users to such videos when they view similar content, and the Biden administration agrees.

In response, Google notes that “algorithmic tools—from search rankings and content recommendations to email spam-filtering—are indispensable to a functional internet.” Google argues that there is no defensible distinction between YouTube’s “up next” feature and other algorithms that enable internet users to sort through an “unimaginably vast” amount of material to find relevant and useful information.

If YouTube’s “algorithmic tools” expose Google to liability for content it did not create, in other words, every provider of an “interactive computer service” will have to worry about the legal risk of guiding users through a massive morass of material that would otherwise be unmanageable. This is just one facet of a broader problem with making it easier to sue websites over third-party content.

President Joe Biden thinks repealing Section 230 would make it possible to "hold social media platforms accountable for spreading hate and fueling violence." Republican politicians like Sen. Roger Wicker (R-Miss.), meanwhile, complain that Section 230 allows those platforms to discriminate against conservatives with impunity.

The fact that two sets of critics blame Section 230 for either too little or too much content moderation suggests something is wrong with their reasoning. In reality, the First Amendment protects both “hate speech” and editorial discretion.

Repealing Section 230 would not change that. But the resulting litigation would force platforms, especially those without the resources to battle a flood of lawsuits, to choose between much more heavy-handed content moderation and none at all—a situation that neither Biden nor Wicker would welcome.

© Copyright 2023 by Creators Syndicate Inc.

The post The Hazards of Holding YouTube Liable for Promoting Terrorism appeared first on Reason.com.