Facebook Fake News Detector Turns Out To Be A Plugin [REPORT]


What many initially believed to be a Facebook tool for attaching warning labels to fake news articles in users’ news feeds turns out to be an implementation of the Chrome plugin B.S. Detector, according to TechCrunch. TechCrunch had originally reported that the social networking giant was testing such a tool itself, but it later updated the story.

Warning labels on news articles

TechCrunch had reported that Facebook was conducting the test quietly on a small sample of its 1.79 billion users. This was taken as the first indication that the company was making good on its promise to curb fake news on its platform.

TechCrunch said that one such label read, “This website is not a reliable news source. Reason: Classification Pending.” In other cases, “state-sponsored news” was listed as the reason.

According to TechCrunch, one flagged story came from occupydemocrats.com and reported that veterans were forming a “human shield” around protesters at Standing Rock. The technology news site noted that the story appears to be accurate, as it was also picked up by The New York Times, Reuters and other reliable sources.

Facebook still to take action

The supposed test has now been running for almost a week, and some users report that the labels are vanishing from feeds as quickly as they appeared. Many mistook the B.S. Detector Chrome plugin for a Facebook feature because its limited appearance matched the way Facebook typically rolls out tests to a small sample of users. Facebook, then, still has to come up with its own plan to curb fake news.

Last month, CEO Mark Zuckerberg outlined guidelines for combating misinformation. Labeling misinformation was one of the numerous steps discussed. The company is also looking to collaborate with various fact-checking groups and to make it easier for users to report misinformation, notes USA Today, and it is building technology to classify misinformation.

Earlier, Zuckerberg had argued that Facebook played no role in swaying public opinion in the U.S. election that resulted in a win for Donald Trump.

“We take misinformation seriously,” Zuckerberg wrote. “We know people want accurate information.”

Initially, the company did not accept that fake news on its platform was misleading users. Later, however, it bowed to media pressure, saying it would take strong steps to curb the spread of fake news.
