FCC Chair Brendan Carr Wants More Control Over Social Media


In his short time as chairman of the Federal Communications Commission (FCC), Brendan Carr has been no stranger to using his power against disfavored entities. The chairman’s targets have primarily included broadcast networks and social media companies.

Recently, Carr revealed a fundamental misunderstanding about one of the most important laws governing the internet and social media.

On February 27, digital news outlet Semafor held a summit in Washington, D.C., titled “Innovating to Restore Trust in News,” which culminated in a conversation between Semafor editor-in-chief Ben Smith and Carr.

“The social media companies got more power over more speech than any institution in history” in recent years, Carr told Smith. “And I think they’re abusing that power. I think it’s appropriate for the FCC to say, let’s take another look at Section 230.”

Section 230 of the Communications Act effectively protects websites and platforms from civil liability for content posted by others. It also protects a platform’s decision to moderate content it finds “objectionable, whether or not such material is constitutionally protected.”

Like many conservatives, Carr looks askance at social media’s latitude to moderate content with what he perceives as impunity. “The FCC should issue an order that interprets Section 230 in a way that eliminates the expansive, non-textual immunities that courts have read into the statute” and “remind courts how the various portions of Section 230 operate,” he wrote in a chapter of The Heritage Foundation’s Mandate for Leadership, more popularly known as Project 2025.

As Reason noted earlier this month, Carr is mistaken. The Supreme Court ruled last year in Loper Bright Enterprises v. Raimondo that government agencies like the FCC do not have the authority to “clarify” or “interpret” statutes as they see fit; Congress and the courts share that job.

Smith pressed Carr on this point. “The Supreme Court has recently ruled pretty strongly,” he noted, “that regulatory agencies are not allowed to go rooting around regulation, looking for new mandates to go dive into the private sector and enforce things on them. And I think a lot of people think your Section 230 aspirations are going to ultimately hit a legal wall, and it just feels like you’re really interested in expanding the power of government in a way that feels new.”

The comparison is “apples and oranges,” Carr replied. Unlike regulatory endeavors undertaken by previous FCC regimes, “social media content moderation is regulated by Congress through Section 230, and so there’s a question of how should that thing apply to social media.”

This, too, is completely backward. “Section 230 allows for web operators, large and small, to moderate user speech and content as they see fit,” according to the Electronic Frontier Foundation (EFF). “This reinforces the First Amendment’s protections for publishers to decide what content they will distribute.”

“The statute was never intended to be a legal ‘stick’ to impose conditions on providers’ behavior,” adds the Wikimedia Foundation, the nonprofit that hosts Wikipedia. “Instead it offers a legal ‘carrot’ to providers who do indeed take steps to moderate content by shielding them from liability for their moderation efforts.”

“Section 230 grants complete immunity for publisher or speaker activities regardless of whether the challenged speech is unlawful,” according to a February 2024 report from the Congressional Research Service. “In contrast, the First Amendment requires an inquiry into whether the challenged speech is constitutionally protected and may provide limited or no immunity for certain activities.”

This was the explicit intent of the law: to encourage platforms to moderate by giving them the freedom to make moderation decisions without having to justify each one in court or fear that a poor call could be held against them.

In fact, in the absence of Section 230—or even if it were merely weakened—platforms could be subject to lawsuits if they remove any content, even though choosing what content to allow on your privately owned website is protected by the First Amendment. Giants like Facebook and YouTube could likely weather that kind of onslaught, but smaller competitors could be driven out of business or otherwise made useless.

Without Section 230, Yelp—the online platform where users rate and review businesses—would be vulnerable to lawsuits from business owners who disagreed with negative reviews. “A user who believes a review violates our content guidelines can flag it for removal—this includes reviews that a final court of competent jurisdiction has deemed to be defamatory,” Yelp general counsel Aaron Schur told the EFF. “We do not take sides in factual disputes, however, so we do not remove reviews that appear to reflect the experiences of the reviewer….[Section] 230 is pivotal to our business in this regard: reviews are the responsibility of the people who write them, not the platform that hosts them.”

“Absent [Section] 230, websites like Yelp would be pressured to avoid liability by removing legitimate, negative reviews, and they would deprive consumers of information about the experiences of others,” Schur added.

If Section 230 were weakened or revised, it would empower the government to wield control over internet platforms by opening them up to lawsuits. Smith said as much to Carr, marveling that Carr seemed “so eager to get a government agency into the business of these private companies.”

“I’m eager to apply the law as passed by Congress,” Carr retorted. But if he were truly eager to apply Section 230 as Congress passed it, the best way to do so would be to keep his hands off.
