Google says it will have more than 10,000 members of staff monitoring content on YouTube next year.

YouTube has been criticised for failing to adequately safeguard children and for allowing extremist content, including Islamic terror-related and white supremacist videos, to be shared.

Although Google, YouTube's parent company, uses machine-learning algorithms to automatically flag videos that may breach its rules, the final decision to remove content is made by humans.

In a statement, YouTube's chief executive, Susan Wojcicki, said the company had reviewed almost two million videos and removed 150,000 since June.

In August, YouTube was criticised for deleting video evidence relating to potential war crimes in Syria as part of its work to remove terrorist content and propaganda from the platform.

A range of private and public sector organisations withdrew their advertising from YouTube in March amid concerns it was appearing beside inappropriate content.

Ms Wojcicki said she has seen how YouTube’s open platform “has been a force for creativity, learning and access to information” and been used by activists to “advocate for social change, mobilise protests, and document war crimes”.

“I’ve also seen up-close that there can be another, more troubling, side of YouTube’s openness. I’ve seen how some bad actors are exploiting our openness to mislead, manipulate, harass or even harm,” she warned.

According to the statement, 98% of the videos that YouTube removes for violent extremism are flagged by its machine-learning algorithms, and nearly 70% of these are removed within eight hours of upload.
