YouTube to hire more screeners to seek out offensive video


Last month, the BBC and The Times found paedophiles were posting indecent comments on videos of youngsters, evading discovery through flaws in YouTube's reporting system.

The news from YouTube's CEO, Susan Wojcicki, followed a steady stream of negative press surrounding the site's role in spreading harassing videos, misinformation, hate speech and content that is harmful to children.

Google, which owns YouTube, announced on Monday that next year it would expand to more than 10,000 the number of people responsible for reviewing content that could violate its policies.

YouTube said machine learning was helping its human moderators remove almost five times as many videos as they had previously, and that 98% of videos removed for violent extremism were now flagged by algorithms.

Reuters reported last month that several major advertisers - such as Lidl and Mars - had pulled ads from the platform over "clips of scantily clad children".

As the YouTube CEO explained, the company will continue to bolster its staff, which will be tasked with monitoring video content, and to expand "the network of academics, industry groups and subject matter experts" who can help YouTube "better understand emerging issues". "We want advertisers to have peace of mind that their ads are running alongside content that reflects their brand's values," she said.

In addition to enlisting the help of 10,000 Google employees, YouTube also plans to draw up stricter criteria for deciding which channels are eligible for advertising.

Wojcicki said that while YouTube was a "force for creativity, learning and access to information", she had also "seen up-close that there can be another, more troubling, side of YouTube's openness".

"Since June, our trust and safety teams have manually reviewed almost two million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future".

She said advances in machine learning meant YouTube could take down almost 70 per cent of violent extremist content within eight hours of it being uploaded, and almost half of it within two hours.

Wojcicki said it would have taken 180,000 people working 40-hour weeks to assess the same amount of content.
