YouTube limits moderators to viewing four hours of disturbing content per day




YouTube CEO Susan Wojcicki said today that the video platform has begun limiting the number of hours its part-time moderators can spend watching disturbing videos to four hours per day. The news, announced during a question-and-answer session at Wojcicki's South by Southwest Interactive talk here in Austin, comes at a time when companies like YouTube are struggling to review the huge volume of content uploaded by users and make sure it complies with their policies. Platforms including YouTube, Facebook, Reddit, and Twitter have been criticized for subjecting poorly paid contractors to content that can be extremely disturbing.
"This is a real problem and I myself have spent a lot of time looking at this content over the past year, it's really difficult," Wojcicki said of the moderation of the content. Recent solutions in which the company has entered include both limiting the daily hours contractors do this work and providing what Wojcicki called "welfare benefits."
Federal laws that shield technology companies from liability for content uploaded by users still require that they remove illegal videos from the platform, which YouTube does using a combination of human and algorithmic moderation. YouTube operates a system known as Content ID to identify and remove straightforward infringements involving copyrighted television, film, and music. But for videos depicting violence, murder, suicide, and other disturbing subjects, YouTube employs part-time human moderators to manually confirm the content of the videos. These people are often hired as contractors and do not have the same access to mental health benefits as full-time Google employees. It is not clear what kinds of psychological counseling YouTube's part-time contractors receive under Wojcicki's definition of "wellness benefits."
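To illustrate the algorithmic side of this pipeline, here is a minimal sketch of fingerprint-based matching in Python. The chunked exact hashing and the 0.8 threshold are illustrative assumptions for this example, not a description of how Content ID actually works.

```python
# Illustrative sketch of fingerprint matching against known copyrighted
# material. Chunk size, hashing scheme, and threshold are assumptions.
import hashlib


def fingerprint(samples: bytes, chunk_size: int = 4096) -> set[str]:
    """Hash fixed-size chunks of raw media bytes into a set of fingerprints."""
    return {
        hashlib.sha256(samples[i:i + chunk_size]).hexdigest()
        for i in range(0, len(samples), chunk_size)
    }


def match_score(upload: bytes, reference_index: set[str]) -> float:
    """Fraction of an upload's fingerprints found in the reference index."""
    prints = fingerprint(upload)
    if not prints:
        return 0.0
    return len(prints & reference_index) / len(prints)


# Build an index from known copyrighted material, then screen a new upload.
reference_index = fingerprint(b"...reference audio or video bytes...")
if match_score(b"...uploaded bytes...", reference_index) > 0.8:
    print("Likely match: flag for automated takedown or human review")
```

In practice, exact hashes only catch byte-identical copies; production systems use perceptual fingerprints that survive re-encoding, cropping, and pitch shifts, and human moderators still handle the judgment calls a matcher cannot make.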
This problem may intensify in the coming months, as YouTube has committed to hiring 10,000 people to address the limitations of its algorithms, a process that Wojcicki said today is ongoing. The company has been under fire for the past 18 months for failing to address a rise in conspiracy theory, propaganda, fake news, and religious radicalization videos on its platform. The backlash led to a series of advertising boycotts, pushing Wojcicki and her team to start taking the issue much more seriously by employing more human moderators.

This is why I can't take seriously calls for Facebook, YouTube, etc. to increase moderation that don't consider the human cost of exposing low-wage workers to extremely traumatic and tedious jobs. https://t.co/YkFEEAoAEa — Adrian Chen (@AdrianChen) March 13, 2018

Meanwhile, YouTube remains mired in controversy over its failure to catch obvious violations. This week, critics pointed out that Infowars conspiracy theory videos about the Austin bombings reached the top of YouTube's search results, something the company said it could not immediately explain.
Now, it seems, YouTube is grappling on a separate front with the psychological toll its solution to these problems takes on part-time workers. Wojcicki acknowledged that even four hours a day is still a lot. We have reached out to YouTube to clarify how many hours its content moderators were previously asked to watch these types of videos, and we will update this story when we receive a response.

