Google And YouTube Set New Policies To Take On Extremist Videos


YouTube parent company Google is taking a tough stance on extremist videos and content. It is now working to quickly identify such videos and bury them. Google’s new measures come weeks after the deadly terror attack in London, after which British Prime Minister Theresa May called for new regulations on Internet companies.


New policies to curb extremist videos on YouTube

According to media reports, one of the London Bridge attackers became radicalized after watching YouTube videos of an Islamic preacher. YouTube's rules and regulations are complex, and loopholes in how its policies are framed and enforced mean such content is not always filtered out as inappropriate, explains The New York Times.


Many have also accused the video streaming company of profiting by failing to restrict such hateful content. In the past, ads from prominent brands have appeared beside extremist content on the platform.

Now, however, Google says it is ready to take stronger steps in the fight against terrorism. The new policies call for quicker detection, stricter norms, more experts and expanded counter-radicalization work. The search giant will also make videos with an extremist tone harder to find in searches. Such videos will carry a warning and will not be eligible for user comments or ad revenue.

Google will use “more engineering resources to apply our most advanced machine learning research to train new ‘content classifiers’ to help us more quickly identify and remove such content.” Further, YouTube's flagger program will be expanded with 50 more expert non-governmental organizations, in addition to the 63 groups already working under the program. Google will fund these groups.

A spokeswoman for the company stated that Google is working with Jigsaw to implement its “redirect method” and expand its reach throughout Europe. The method uses targeted online advertising to identify potential ISIS recruits and redirect them to anti-terrorism videos intended to change their minds about joining. The spokeswoman added that Jigsaw’s “redirect method” is already in use in the United States, notes CNET.

Balancing free expression and access to information

Google also said it is working with other big tech firms like Facebook, Microsoft and Twitter to develop technology to curtail hateful content on the Internet, and to support smaller companies in order to boost their joint efforts to tackle terrorism online, notes the Telegraph.

“We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints,” Google General Counsel and Senior Vice President Kent Walker said in a blog post on Sunday.

Tech companies worldwide are under immense pressure to tighten their policies and adopt rules stringent enough to accurately filter out extremist videos and ideologies online. After the most recent London terror attack, those calls have grown even louder.

Recently, British Prime Minister Theresa May said, “We cannot allow this ideology the safe space it needs to breed – yet that is precisely what the internet, and the big companies that provide internet-based services provide.”
