Facebook Returns Phrase “Video Of Child Abuse” In Search Suggestions


While intelligent search algorithms are useful and underpin much of how we efficiently access information online, sometimes they lack the common sense that a human would provide. Facebook has found itself in hot water after its search bar suggested results like “video of girl sucking d*ck under water” and “video of little girl sucking.”

The “Video Of” Conundrum

The social media giant has since apologized for the error and said that it is currently investigating the matter.

The news initially broke on Thursday, when multiple users posted screenshots to Twitter showing the inappropriate suggestions that appeared after entering the words “video of” into the Facebook search bar. The phrases mentioned above are just a couple of those listed, with other suggestions including “video of shooting in Florida,” “video of sexing,” and “video of child abuse.”

Although Facebook definitely shoulders some of the blame for these offensive “video of” suggestions, it turns out that users may be more to blame. Facebook is as successful as it is because of its algorithms: the posts you see from friends and the advertisements between them are all ranked in the way Facebook believes will return the best results. While the suggestions don’t mean that you personally have been searching for a video of child abuse, they are based on what others have searched for previously – a function that is normally useful but can surface some unsavory results.
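To make that mechanism concrete, here is a minimal, hypothetical sketch of popularity-based autocomplete – it is not Facebook’s actual code, and the query log, function name, and ranking logic are all assumptions for illustration. It simply ranks previously logged queries that share the typed prefix by how often other users have searched them:

```python
from collections import Counter

# Hypothetical illustration only – not Facebook's actual system.
# Suggestions come from a log of past user queries and are ranked
# purely by how many times other people have typed them.
query_log = [
    "video of cats playing",
    "video of cats playing",
    "video of cats playing",
    "video of shooting in florida",
    "video of shooting in florida",
    "weather tomorrow",
]

def suggest(prefix, log, limit=5):
    """Return the most frequently searched queries beginning with `prefix`."""
    counts = Counter(q for q in log if q.startswith(prefix))
    return [query for query, _ in counts.most_common(limit)]

print(suggest("video of", query_log))
# ['video of cats playing', 'video of shooting in florida']
```

Because a ranking like this reflects nothing but raw popularity, any phrase that enough people type – however offensive – will surface as a suggestion unless an explicit filtering step is layered on top, which is essentially the gap Facebook had to close here.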

Facebook has also come forward and said that the suggestions are based on what people are searching for, but that there isn’t necessarily a video of such offensive material on the platform. The fact that people are searching for a video of child abuse is concerning, but this appears to be a case of Facebook’s algorithm working exactly as intended – with a rather unpleasant outcome.

https://twitter.com/hoss_boss1/status/974480188137287680

The company has since removed the offensive predictions, a quick response to a serious issue. It’s clear that Facebook takes both the reputation of its platform and the content it shows to users quite seriously.

A Problematic Survey

However, this isn’t the first time Facebook has made the news for a child-abuse-related scandal. A survey earlier this month, while not necessarily related to videos of children, asked participants how they felt about the platform potentially allowing child grooming content. The survey sparked outrage in the community and among politicians who worried about the safety of children using social media.

Facebook has since apologized for the survey, with a company spokesperson saying, “We sometimes ask for feedback from people about our community standards and the types of content they would find most concerning on Facebook…We understand this survey refers to offensive content that is already prohibited on Facebook and that we have no intention of allowing so have stopped the survey.”

While it may be normal for Facebook to revise its community standards from time to time, the fact that a survey question would propose that the sharing of a video of child grooming could in any way be appropriate is certainly concerning.

A Common Problem

Facebook is only the most recent company to come under fire for showing inappropriate “video of” search suggestions, and the problem is by no means limited to one organization. YouTube and Google, which also rely on algorithms to predict phrases and provide relevant results, have dealt with similar issues.

At the end of the day, this appears to be a problem with the algorithm working too well and not discriminating when it comes to showing “video of” results that many find offensive. Facebook was quick to take action, and you’re unlikely to find such a suggestion as of this writing, but it certainly goes to show that algorithms don’t have the same morals or common sense that a human does. Moving forward, companies may have to police their search suggestions a little more closely to avoid disasters like the one that happened today. While there were likely no actual videos behind the suggestions, the episode holds the potential to be a PR disaster for any organization that doesn’t take quick action to rein in these issues.
