Kate Starbird is a University of Washington professor and the main character of a recent 60 Minutes segment about the spread of so-called misinformation online. Starbird previously worked with the Department of Homeland Security (DHS) to flag purportedly inaccurate social media content for platform moderators, in the explicit hope that the platforms would remove said content. Yet she told CBS News that the real victims of censorship were researchers like herself, who face increasing scrutiny from conservative media and congressional Republicans.
“Are researchers being chilled?” asked 60 Minutes’ Lesley Stahl.
“Absolutely,” Starbird replied.
And yet, she continues to speak up. The National Press Club is hosting an event with her later this month; its purpose is to help equip journalists to counter the spread of "harmful mis- and disinformation, especially during times of crisis." She will share the spotlight with Tamoa Calzadilla, the editor in chief of Factchequeado, a group that combats misinformation aimed at Latinos. A PBS writeup of Factchequeado laments that Spanish-speaking immigrants from countries with "recent histories of authoritarianism, socialism, high inflation and election fraud may be more vulnerable to misinformation about those topics." One wonders why immigrants fleeing inflation and socialist repression would be less informed on these subjects than native-born citizens; perhaps misinformation watchdogs are worried that Hispanic immigrants might simply disagree with the Democratic Party's position on the extent of these problems.
In any case, the misinformation-fighting industry is growing bigger and bigger. CBS is set to debut a new program, CBS News Confirmed, entirely themed around preventing misinformation. The show will “identify and fight the spread of false stories, conspiracy theories and bad facts,” according to Variety.
Meanwhile, misinformation watchdog groups are pressuring social media companies to do more to combat AI-generated misinformation. Miles Taylor, a former DHS chief of staff—and author of the infamous “I Am Part of the Resistance Inside the Trump Administration” op-ed—who now works for an anti-misinfo tech group, told Axios that insufficient social media moderation was responsible for Donald Trump’s 2016 win.
Thanks to the Twitter Files, it's now public knowledge that an army of federal bureaucrats pressured tech platforms to censor so-called misinformation related to elections, Hunter Biden, COVID-19, and other subjects. Whether these efforts violated the First Amendment is currently being sorted out by the Supreme Court. But the platforms themselves have clearly grown frustrated with government guidance. Meta CEO Mark Zuckerberg is openly frustrated with how his company handled the feds, and Elon Musk's takeover of X, formerly Twitter, was largely motivated by his self-proclaimed desire to resist such censorship.
As a result, “the government isn’t talking to social media companies,” Taylor lamented to Axios. “Many of the social media companies don’t want anything to do with the government—which means novel AI threats could get missed.”
It's certainly true that bad information and made-up nonsense circulate on social media. But the fundamental problem remains that our new self-appointed fact-checkers have not proven to be free from error themselves. Nina Jankowicz, who was tapped to run DHS's misinformation task force, incorrectly identified The New York Post's Hunter Biden laptop story as Russian disinformation; the Global Disinformation Index, a nonprofit group, incorrectly flagged the lab leak theory of COVID-19 origins as racist and conspiratorial (it is neither; indeed, the Energy Department now considers the lab leak the more plausible explanation); and third-party fact-checking organizations frequently lead social media platforms astray.
Misinformation experts would be welcome to contribute to the marketplace of ideas and set the record straight when they think false narratives are circulating online. But their preferred tactic is to shut down speech by working with the media, nonprofits, and even the federal government to pressure social media platforms into bowing to their wishes. This isn't improving the discourse online. It's certainly not making the internet a better place.
NPR’s Errors
For a good example of what can go wrong when a mainstream outlet decides to reflexively parrot the purportedly expert progressive consensus, see this article about NPR in Bari Weiss's The Free Press. Veteran NPR journalist Uri Berliner has tallied a number of instances over the years in which the publicly funded news organization succumbed to groupthink.
“An open-minded spirit no longer exists within NPR, and now, predictably, we don’t have an audience that reflects America,” he writes.
This Week on Free Media
If you haven't heard yet, I'm hosting a new show for Reason about what's happening in the media. (It's named after this newsletter.) This week, Amber Duke and I discussed The View's climate alarmism, MSNBC's dawning realization that perhaps the minimum wage could be contributing to inflation, and whether it's OK to ask questions about January 6. Watch below:
Worth Watching
Reaction from the Reason staff has been mixed, but I admit that I'm intrigued by the trailer for Joker: Folie à Deux, the upcoming sequel to the 2019 Joker film starring Joaquin Phoenix. I liked, but did not love, Joker, which took a cool concept and failed to expand on it in a sufficiently interesting way. Indeed, the original film's trailer is arguably better than the movie, and the best thing about Joker was probably the brief "camera test" teaser. We will see if director Todd Phillips is able to pull off something magical with the addition of Lady Gaga as Harley Quinn.
The post Misinformation Watchdogs Keep Failing Upward appeared first on Reason.com.