- Over 25 people have reportedly left their third-party jobs moderating content on Facebook for in-house positions in TikTok’s Dublin offices, per a CNBC report.
- One former moderator told the outlet that working for Facebook was a “terrible job” and TikTok seems to offer a better situation, in part because there isn’t as much extreme graphic content yet.
- The report comes as Facebook has faced increasing scrutiny in recent years for the psychological stress experienced by people who sift through and moderate violent content — like hate speech, terrorism, and murder — on its platform.
- Facebook reportedly agreed in early October to pay $52 million to settle claims from California contractors who said they developed mental-health conditions from viewing the kind of toxic content that comes with the job.
More than 25 contract workers who moderated content for Facebook have reportedly left outsourcing firms such as Accenture and CPL for in-house roles at TikTok, per a CNBC report that analyzed LinkedIn data as part of its findings.
Social media companies have long turned to outsourcing firms to moderate content on their platforms, a move that critics say allows them to cut costs by exploiting workers. Content moderators are tasked with sifting through mounds of graphic, violent material, ranging from hate speech to footage of murder.
Both TikTok and Facebook have large moderation operations in Dublin, Ireland, where many of the TikTok roles the former Facebook moderators have taken are based.
TikTok has built a number of so-called “trust and safety hubs” there, and the company is reportedly set on growing that workforce as Facebook continues to face mounting scrutiny over the psychological toll its platform takes on the people who moderate it.
“If there’s one company that knows how to ruthlessly poach staff from rivals it’s ByteDance,” social media analyst Matthew Brennan told CNBC, referring to the popular video-sharing app’s China-based parent company. “They won’t think twice about swooping in to take advantage of Facebook’s difficulties.”
Spokespeople for TikTok and Facebook did not immediately respond to Business Insider’s request for comment.
It’s unclear exactly why the employees left their Facebook jobs for TikTok, CNBC notes, but the news comes amid mounting reports of the toll that violent content on Facebook takes on the people who monitor it. Facebook does require the third-party firms it works with to provide onsite counselors and an around-the-clock support hotline for workers.
Chris Gray, a former Facebook content moderator based in Dublin, told CNBC that TikTok hires its content moderators in-house rather than through third-party staffing companies, and that it may have a system in place to support employees who experience trauma from their work.
A 2019 investigation from The Verge detailed how third-party workers moderating content for Facebook suffered severe psychological stress while making $15 an hour. In early October, Facebook also reportedly agreed to pay $52 million in a California settlement to content moderators who say they developed mental-health conditions while monitoring toxic content on its site.
Read more: Facebook is letting its employees work from home until July 2021 due to the pandemic
The pandemic has brought the issue into even sharper focus as contract workers across the industry have been told to return to the office while their corporate colleagues are allowed to keep working remotely. Days after Accenture workers moderating Facebook content were told to return to their Austin office, one of the contractors reportedly tested positive for COVID-19.
Gray and others in the EU have filed a lawsuit against Facebook and CPL that claims the contractors have developed post-traumatic stress from working in these roles. Gray told CNBC that he hopes the case will result in a ruling that shows “Facebook didn’t take care of people and that they have been willfully blind to what was going on.”
You can read the full report on CNBC here.