Google’s YouTube has come under criticism this year over videos that have appeared on its platform, including hate speech, extremist content and more. Google now has a new plan to deal with these.
The company has announced that it intends to grow the number of staff working to address content that may violate its policies to over 10,000 people in 2018, many of whom will moderate videos on YouTube.
“Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content. Since June, our trust and safety teams have manually reviewed nearly 2 million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future. We are also taking aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether. In the last few weeks we’ve used machine learning to help human reviewers find and terminate hundreds of accounts and shut down hundreds of thousands of comments. Our teams also work closely with NCMEC, the IWF, and other child safety organizations around the world to report predatory behavior and accounts to the correct law enforcement agencies.
We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.”
You can find more details about Google’s plans to remove questionable content from YouTube at the link below.