A group of more than 80 prominent fact-checking organizations around the world is pressing YouTube to take action against COVID misinformation, which remains widespread on the platform two years into the pandemic.
“As an international network of fact-checking organizations, we monitor how lies spread online — and every day, we see that YouTube is one of the major conduits of online disinformation and misinformation worldwide,” the coalition of fact checkers wrote in an open letter published on Poynter. “This is a significant concern among our global fact-checking community.”
The collection of fact-checking organizations that signed the letter spans the globe, including U.S.-based groups like PolitiFact, The Washington Post Fact Checker and Poynter’s MediaWise, alongside Africa’s Dubawa and Africa Check, India’s Fact Crescendo and Factly and many more organizations from countries including Indonesia, Israel and Turkey.
The group notes that health-related misinformation has long found fertile ground on the video-sharing site, including content encouraging cancer patients to fight their conditions with unscientific treatments.
“In the last year, we have seen conspiracy groups thriving and collaborating across borders, including an international movement that started in Germany, jumped to Spain and spread through Latin America, all on YouTube,” the letter states. “Meanwhile, millions of other users were watching videos in Greek and Arabic that encouraged them to boycott vaccinations or treat their COVID-19 infections with bogus cures.”
The letter also highlights the particular dangers of misinformation spreading in non-English language videos. Facebook whistleblower Frances Haugen called attention to parallel concerns on that platform, which also does not invest evenly in content moderation outside of English-speaking countries. The fact-checking group encourages YouTube to “provide country- and language-specific data, as well as transcription services that work in any language” to push back against the flow of misinformation in languages beyond English, the language on which the company concentrates its moderation efforts.
The fact-checkers don’t just present problems; they propose solutions as well, arguing that the company should create far more transparency around its misinformation and disinformation policies and support independent researchers who specialize in those issues. The group also urges YouTube to step up its efforts to debunk misinformation and provide immediate context on-platform, two tactics that could be accomplished by deepening its work with fact-checking organizations.
While Facebook and Twitter have long faced intense public scrutiny for the spread of misinformation on their platforms, YouTube often manages to fly under the radar. Its recommendation algorithm has played an active role in promoting dangerous claims in recent years, but because the platform, like TikTok, is built on video rather than text, it is generally more difficult for researchers to study and for lawmakers holding tech accountability hearings to wrap their heads around.
“YouTube is allowing its platform to be weaponized by unscrupulous actors to manipulate and exploit others, and to organize and fundraise themselves,” the group wrote. “Current measures are proving insufficient.”