Google Hiring Thousands of Moderators To Clean Up YouTube


YouTube addressed the issue this week, saying it has reviewed almost 2 million videos for violent, extreme content and removed more than 150,000 of those videos since June - largely with the help of its "machine-learning technology" that can identify problematic videos.

YouTube CEO Susan Wojcicki said the company would take "aggressive action" by launching new comment moderation tools.

For instance, Wojcicki said YouTube will have 10,000 people working to address questionable content by 2018, noting that human reviewers are essential to training the company's machine-learning systems.

YouTube last week updated its recommendation feature to spotlight videos users are likely to find the most gratifying, brushing aside concerns that such an approach can trap people in bubbles of misinformation and like-minded opinions. Several reports stated that advertisers had become uncomfortable placing their ads on YouTube as a result.

Finally, YouTube promises to be a lot more transparent.

"Our advances in machine learning let us now take down almost 70 percent of violent extremist content within eight hours of upload and almost half of it in two hours and we continue to accelerate that speed", Wojcicki noted.


A Times investigation last month revealed that YouTube videos showing "scantily clad children" had attracted comments from hundreds of paedophiles, in some cases encouraging the children to perform sex acts.

On December 4, Alphabet-owned YouTube announced it would expand its team of reviewers handling extremist and violent content by 2018. Some 250 advertisers earlier this year said they would boycott YouTube because of extremist videos that promoted hate and violence. The company also said it wants to give creators confidence that their revenue won't be hurt by the actions of bad actors.

Wojcicki said YouTube will also use technology to flag "problematic" videos or comments that contain hate speech or depict harm to children.

The technology has reviewed and flagged content that would have taken 180,000 people working 40 hours a week to assess, according to Wojcicki.

"We've heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don't demonetise videos by mistake", Wojcicki said.