Google has outlined four steps it is taking to fight the spread of terrorist-related content on YouTube, following criticism from politicians that internet firms are not dealing with extremist content quickly enough.
“Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all,” Kent Walker, Google’s general counsel, wrote in a blog post. “There should be no place for terrorist content on our services.”
Step one is putting more engineering resources into developing artificial intelligence software that can be trained to identify and remove extremist content.
Step two, CNBC reported, is expanding the number of independent experts in YouTube’s Trusted Flagger program, whose reports are accurate more than 90% of the time.
Step three is taking a tougher stance on videos that do not clearly violate YouTube’s rules. Videos containing inflammatory religious or supremacist content will carry warnings and will not be monetized, recommended, or eligible for user comments.
Step four: YouTube is working with Jigsaw, which uses ad targeting to direct potential ISIS recruits to anti-terrorist videos that could change their minds about joining extremist organizations.