Instagram is the latest platform to fall victim to faulty algorithms. The Facebook-owned app used a hateful photo reading "I will rape you" to advertise itself on Facebook, according to The Guardian.
The screenshot showed a hateful email that The Guardian reporter Olivia Solon had received in the past and posted to her Instagram account.
The email read, “I will rape you before I kill you, you filthy whore!” The subject line of the email was “Olivia, you fucking bitch!!!!!!!!”
According to Solon, Instagram used the screenshot of that email in an ad inviting her friends to join the platform.
“Instagram is using one of my most ‘engaging’ posts to advertise its service to others on Facebook,” she tweeted.
The post had received three likes and more than a dozen sympathetic comments, which was apparently enough for Facebook to deem it "engaging."
The notorious algorithms are clearly at play here. What is more concerning, however, is that it remains unclear whether Instagram has any system in place to detect violent or abusive text and flag it as unsuitable for an advertisement.
Following the incident, an Instagram spokesperson apologized and said that the image had not been used for "paid promotion."
“We are sorry this happened – it’s not the experience we want someone to have,” the photo-sharing app said in a statement to The Guardian.
However, Instagram explained that the notification post was part of a strategy to encourage engagement on the platform, and that such posts are not displayed widely but only to a small percentage of a person's Facebook friends. According to the spokesperson, the sole purpose of such posts is to draw back people who are not using the Instagram app or have not been active on the platform.
This could mean more trouble for Instagram and Facebook, which just last week took flak for allowing advertisers to target users who describe themselves as "Jew haters" or express interest in categories such as "how to burn Jews." Following that incident, Facebook excluded some of the interests users can express from ad targeting and added a more human element to the automated process.
In a recent post, Facebook COO Sheryl Sandberg wrote, "We never intended or anticipated this functionality being used this way — and that is on us."
This incident again shows how heavily these companies rely on automation for things like account verification and ad sales. Instead of addressing that reliance, however, they simply issue apology statements when something goes wrong.
The Instagram app is mainly used to share pictures, but these days it is being used for much more than that. Hundreds of Islamic State supporters recently used the platform to spread messages of terror and, in doing so, accidentally revealed the locations from which they uploaded their photos and videos.
It was essentially a privacy blunder: the ISIS supporters apparently forgot to strip the geolocation data from their posts while using Instagram Stories, a feature that lets users share a series of photos and videos that disappear after 24 hours. Once a post's location is shared, however, it is possible to track the poster during the 24 hours the Story remains visible.