Facebook moderators are told that singing karaoke might help them cope with filtering graphic, violent content, a worker said. ‘You don’t always feel like singing after you’ve seen someone battered to bits.’

Facebook co-founder and CEO Mark Zuckerberg testifies before the House Financial Services Committee in the Rayburn House Office Building on Capitol Hill on October 23, 2019, in Washington, DC.

  • Facebook content moderators are told singing karaoke might help them cope with their job, one moderator said Wednesday.
  • Moderators sift through graphic content including child exploitation, suicide, and violence.
  • “You don’t always feel like singing, frankly, after you’ve seen someone battered to bits,” Isabella Plunkett said.

Facebook content moderators are advised to sing karaoke and paint to cope with filtering disturbing, graphic content from the social-media platform all day, a worker told a government committee on Wednesday.

Content moderators trawl through graphic posts, pictures, and videos that users have uploaded to the site, deleting some of them with the aim of making Facebook more user-friendly.

During a hearing in the Irish Parliament on Facebook’s treatment of subcontracted content moderators, Isabella Plunkett, a moderator who works for Covalen, one of Facebook’s largest contractors in Ireland, said Facebook’s support for moderators was well-intentioned but insufficient for the strain of the job.

“To help us cope, they offer ‘wellness coaches,’” the 26-year-old said. “These people mean well, but they’re not doctors. They suggest karaoke or painting, but you don’t always feel like singing, frankly, after you’ve seen someone battered to bits.”

Child exploitation, suicide, and graphic violence are just some of the types of content Plunkett said she sees on a daily basis.

“I have horrible lucid dreams about the things I’ve seen and for months I’ve been taking antidepressants because of this content,” she told the committee.

She separately received a referral to the company doctor, but didn’t hear back about a follow-up appointment, she said.

Facebook has been heavily criticized for its treatment of these workers. Most recently, a Facebook content moderator in Texas reportedly shared an internal note condemning the company for advising workers to do “breathing exercises” after looking at disturbing content.

Like other content moderators, Plunkett wasn’t allowed to speak to her friends and family about her work because she signed a non-disclosure agreement when she started the job, she said.

“You feel alone,” she said.

Moderators are told to limit their exposure to self-harm and child-abuse content to two hours a day, but that doesn’t happen in practice, Plunkett said, without elaborating.

A Facebook spokesperson told Insider that the company provides support for its content reviewers, “as we recognise that reviewing certain types of content can sometimes be hard.”

Content moderators have “in-depth training” and access to psychological support for their wellbeing, the spokesperson said.

“We are also employing technical solutions to limit their exposure to potentially graphic material as much as possible. This is an important issue, and we are committed to getting this right,” they added.

A Covalen spokesperson told Insider in a statement: “We value our employees and the vital work they do. Content moderation is critical in keeping our online communities safe but we know that reviewing certain types of content on social media platforms can be difficult.”

They said Covalen provides a range of well-being measures for moderators, including 24-hour support and supervision, wellness coaching by “highly-qualified professionals,” and enhanced training on trauma, stress management, and personal resilience.

Plunkett said her job as a moderator was to “train the algorithm” by flagging particular hate speech and graphic videos so that one day a machine, rather than a human, could do the work.

CEO Mark Zuckerberg said in 2019 that some of the stories of Facebook content moderators struggling to cope with the daily work were “a little overdramatic.” Zuckerberg made the comments during a company-wide meeting in the summer of that year, in response to an employee question on news reports featuring traumatized content moderators. The audio was obtained by The Verge.
