YouTube hiring thousands to monitor abuse videos


Global IT giant Google is hiring thousands of moderators to detect extremist content on its video-sharing service YouTube, as well as other material that violates its rules, YouTube CEO Susan Wojcicki has said.

In June, Google announced additional measures to counter the spread of extremist content on YouTube.

The company said it would widen the use of technology to identify extremist and terrorist-related videos, bring more experts into its programme for identifying problematic videos, toughen its approach to content that does not clearly violate YouTube’s rules, and expand its role in the fight against radical movements.

“Since June, our trust and safety teams have manually reviewed nearly two million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future.

“We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018,” Wojcicki said in an article published on the official blog of YouTube.

The official added that the company was also fighting aggression in comments and cooperating with a number of child safety groups, such as the National Center for Missing and Exploited Children, to combat predatory behaviour.

According to Wojcicki, YouTube has also worked to expand its “network of academics, industry groups and subject matter experts” who help train the company’s specialists to respond better to developments in the world and on the Internet.

There has been an outcry over Google allowing violent content to slip past the YouTube Kids filter, which is supposed to block any content that is not appropriate for young users. Some parents recently discovered that YouTube Kids was letting children see videos featuring familiar characters in violent or lewd scenarios, along with nursery rhymes mixed with disturbing imagery, according to the New York Times.

Other reports uncovered “verified” channels featuring child exploitation videos, including viral footage of screaming children being mock-tortured and webcams of young girls in revealing clothing.

YouTube has also repeatedly sparked outrage for its role in spreading misinformation and harassing videos in the wake of mass shootings and other national tragedies.

Some parents of people killed in high-profile shootings have spent countless hours trying to report abusive videos about their deceased children and have repeatedly called on Google to hire more moderators and to better enforce its policies.

Video: https://www.youtube.com/watch?v=dK5Q8j5Vkl4