A little over a month ago, Google announced several steps it would be taking to combat extremist content on YouTube.
According to its latest update, these efforts are paying off. For starters, its machine learning systems are getting better at identifying extremist videos. More than 75% of videos removed over the past month were taken down before getting flagged by a human.
This speed is necessary considering more than 400 hours of content are uploaded to YouTube every minute.
And accuracy is improving: "While these tools aren’t perfect, and aren’t right for every setting, in many cases our systems have proven more accurate than humans at flagging videos that need to be removed," the YouTube team wrote in a blog post.
Today's announcement from YouTube comes at the same time as U.K. Home Secretary Amber Rudd's visit to Silicon Valley. Rudd is visiting California to warn tech giants that the U.K. could introduce laws to clamp down on extremist content.
The technology is being reinforced by more human expertise. Google has added 15 of the 50 expert NGOs and institutions promised in its June announcement to its Trusted Flagger program.
YouTube also plans to be more strict in the coming weeks with videos that "aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism."
Videos that don't directly violate policies but do contain controversial religious or supremacist content will be placed in a "limited state."
"The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes," the blog post said.
This new treatment will roll out in the coming weeks on desktop versions of YouTube and on mobile soon after.
YouTube has also progressed in its plans to actively counter terrorism. The platform has started rolling out features from Jigsaw's Redirect Method: users who search for certain sensitive keywords on YouTube will be redirected to a playlist of curated videos that "directly confront and debunk violent extremist messages."
It's also taking activism on the road. Last week, the U.K. chapter of its YouTube Creators for Change program hosted a two-day workshop for 13- to 18-year-olds aimed at helping them learn how to participate safely and responsibly on the internet.
YouTube has pledged to expand the program’s reach to 20,000 more teens across the U.K.
This story first appeared on campaignlive.co.uk.