Facebook is Testing an “Extremism Detector” Function

Facebook Inc (NASDAQ:FB) users in the U.S. are receiving prompts asking whether they are concerned that someone they know is becoming an “extremist,” or whether they themselves have been exposed to extremist content.

Facebook tackling online extremism and abuse

Users of the platform took to Twitter to share screenshots of the prompts, which read “Are you concerned that someone you know is becoming an extremist?” and “You may have been exposed to harmful extremist content recently.” Both prompts link to a support page.

The Facebook support page, titled “What can I do to prevent radicalization,” directs users to Life After Hate by ExitUSA, which the company says helps people find “a way out of hate and violence.” The page also offers advice on engaging with someone who is trying to leave a hate group.

According to The Verge, this is part of a test Facebook is running to provide resources and support to people who may have engaged with or been exposed to extremist content, or who may know someone at risk.

“We are partnering with NGOs and academic experts in this space and hope to have more to share in the future,” spokesperson Andy Stone told CNN.

A difficult task

According to The Verge, Facebook has not been effective at tracking down and removing hateful content, “even from groups that it has actively tried to kick off the platform.”

The company has been in the hot seat several times in recent years, facing intense scrutiny from critics who argue that it has failed to take sufficient action against extremist groups on its platform.

One of the most recent instances came in 2020, when Facebook was slow to delete the page of a militia group in Wisconsin that urged armed citizens to take to the streets. At the time, several users had reported the account for inciting violence, but the company responded that it didn’t violate any rules.

“Facebook later acknowledged that while it removed the militia page, it failed to remove the event being promoted by the group,” CNN reported.

Still, this is not a problem unique to Facebook, as various other platforms are still struggling to track down and address users who abuse women online or exhibit toxic behavior. Just yesterday, Google parent Alphabet Inc (NASDAQ:GOOGL), TikTok and Twitter Inc (NYSE:TWTR) committed to curbing the abuse of women online by studying new features “to help women better curate their online experiences and to improve reporting features,” as reported by CNET.

Facebook is part of the Entrepreneur Index, which tracks 60 of the largest publicly traded companies managed by their founders or their founders' families.