YouTube Now Redirects Potential ISIS Recruits To Anti-Terrorist Video


YouTube wants to steer clear of the accusations leveled at big tech companies for doing too little to curb extremist videos. To do this, the company is enlisting Jigsaw, Google’s in-house think tank, to identify and target ISIS-related propaganda on YouTube.

How YouTube’s Redirect Method works

Terrorist groups have long used YouTube to post extremist videos aimed at radicalizing and recruiting potential targets. To counter this, the company has come up with a new project called the Redirect Method, which will help divert people’s attention from violent extremist propaganda. Under the new approach, when a potential ISIS recruit searches for extremist content, they will be directed to videos that work toward deconstructing terrorist groups.

In a blog post, the company stated that the new method will drive “people away from violent extremist propaganda and steer them toward video content that confronts extremist messages and debunks its mythology.”

YouTube’s Redirect Method website says, “It focuses on the slice of ISIS’ audience that is most susceptible to its messaging, and redirects them towards curated YouTube videos debunking ISIS recruiting themes.”

Further, the website says the method, which was developed after interviews with ISIS defectors, does not infringe on users’ privacy and can also be adapted to address other kinds of violent recruiting discourse online.
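YouTube has not published how the matching works under the hood, but the core idea, checking a search query against a curated list of terms and, on a hit, surfacing a counter-narrative playlist instead of the usual results, can be illustrated with a short sketch. The flagged terms, playlist video IDs, and function below are hypothetical placeholders, not YouTube's actual data or code.

```python
# Minimal sketch of the Redirect Method's core idea (hypothetical data):
# match a search query against curated flagged terms and, on a hit,
# return a counter-narrative playlist instead of the organic results.

FLAGGED_TERMS = {"example extremist slogan", "example recruiting phrase"}

COUNTER_NARRATIVE_PLAYLIST = [
    "defector-testimony-video-id",
    "cleric-rebuttal-video-id",
    "on-the-ground-footage-video-id",
]

def search_results(query: str, organic_results: list[str]) -> list[str]:
    """Return curated counter-narrative videos when the query matches
    a flagged term; otherwise fall back to the normal results."""
    normalized = query.strip().lower()
    if any(term in normalized for term in FLAGGED_TERMS):
        return COUNTER_NARRATIVE_PLAYLIST
    return organic_results

if __name__ == "__main__":
    # A matching query is steered to the curated playlist.
    print(search_results("example recruiting phrase near me", ["regular-video-id"]))
```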

In addition, the company stated that in the next few weeks YouTube’s Redirect Method will be expanded to a wider set of search queries in non-English languages with the help of machine learning. YouTube will also work in collaboration with non-governmental organizations to develop new content designed to handle violent extremist messaging “at different parts of the radicalization funnel.”
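The company has not said which models it will use for that expansion. As a rough illustration only, a simple text classifier could flag new phrasings of known recruiting themes that an exact keyword list would miss; the training queries, labels, and threshold below are invented placeholders.

```python
# Hypothetical sketch: use a small text classifier to widen the net
# beyond exact keyword matches. Training data here is invented.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder training data: (query, 1 = redirect-worthy, 0 = benign)
queries = [
    "example recruiting phrase", "example extremist slogan",
    "cooking recipes", "football highlights",
]
labels = [1, 1, 0, 0]

# Character n-gram features tolerate spelling variants and non-Latin
# scripts better than word-level features.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(),
)
model.fit(queries, labels)

def should_redirect(query: str, threshold: float = 0.7) -> bool:
    """Flag a query for the redirect flow if the classifier is confident."""
    return model.predict_proba([query])[0][1] >= threshold

print(should_redirect("example recruiting phrase 2017"))
```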

Other measures to fight extremist content

Google and YouTube woke up to the issue when a large number of advertisers started pulling out after finding that their advertisements were placed alongside extremist content. Thereafter, YouTube established that creators would need more than 10,000 views before they could earn revenue on their videos. However, the outcry three months later was even bigger when one of the three attackers in the London Bridge attack was found to have been influenced by extremist videos on YouTube.

Apart from this latest initiative, Google and YouTube have previously taken several other measures to limit such content on their platforms. The company is using technologies such as video analysis models to identify and remove terrorism-related content. Further, it is working toward increasing the number of independent experts in YouTube’s Trusted Flagger program. The company has also committed to demonetizing and disabling comments on terrorism-related videos, blocking recommendations on them, and showing a warning ahead of “videos that do not clearly violate our policies” but still contain offensive content.

Referring to such measures last month, Google General Counsel Kent Walker said, “We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints.”
