Was Facebook Inc (NASDAQ:FB) attempting a near-impossible task when a lab experiment that used 700,000 of its users as mice sought first to discover and then to alter the emotional state of consumers of its news feed? Or did the experiment really discover that computer algorithms are bad at detecting subtlety, sarcasm and nuance in human communication?


Facebook’s content-altering experiment

As you may have heard, and as reported in ValueWalk, Facebook Inc (NASDAQ:FB) conducted a psychological experiment on nearly 700,000 of its users by altering the content of its news feed to reflect either a more positive or a more negative world view. The researchers were trying to prove the concept that “Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.”

A new Wired article notes that while Facebook’s breach of informed consent seems pretty plain, what’s less clear is how much “emotion contagion” the experiment actually inflicted. That is because the computer algorithms may have been slightly faulty, given their lack of sophistication at detecting nuance, sarcasm, double entendre and other complex human sentiments. Further, compared with the 50 percent probability of a coin flip, how much did the experiment really move the needle?

Facebook’s experiment depends on “sentiment analysis” algorithms

Facebook’s experiment, the article notes, depends on “sentiment analysis”—algorithms that analyze text in an effort to “tease out the emotions behind the words.” Even today’s most sophisticated algorithmic tools, while fascinating and increasingly powerful, still offer only ham-fisted approximations, or partial glimpses at best, of anyone’s emotional leanings. “Computers, it turns out, still have a long way to go before they can really figure out how you feel, and that means Facebook’s ability to understand and influence your feelings is limited, too.”

“Computers just aren’t very good at subtlety,” London-based computer programmer Jonty Waering was quoted as saying. “The issue with this method is a complete inability to deal with sarcasm or words that can be used in a positive sense in specific contexts.”

As just one example, he explains that the phrase “damn good” would rank as negative because “damn” carries a stronger negative connotation in the ranking system than “good” carries positive, even though the phrase is “obviously exceptionally positive.” In response to the Facebook study, which he called “unethical,” Waering wrote his own sarcastic browser extension called “A Better Place” that filters all negative tweets out of your Twitter stream.
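To see why word-list scoring trips over a phrase like “damn good,” here is a minimal sketch of a naive lexicon-based scorer in Python. The word weights and the score function are invented purely for illustration; they are not the actual dictionary or system used in the Facebook study or described by Waering.

    # Minimal lexicon-based sentiment scorer -- an illustrative sketch only.
    # The word weights below are invented for this example; real tools use
    # much larger, empirically derived dictionaries.
    SENTIMENT_LEXICON = {
        "good": 1.0,
        "great": 2.0,
        "happy": 1.5,
        "damn": -2.0,   # flagged as profanity/negative regardless of context
        "bad": -1.5,
        "sad": -1.5,
    }

    def score(text):
        """Sum per-word weights; blind to context, sarcasm and intensifiers."""
        return sum(SENTIMENT_LEXICON.get(word, 0.0) for word in text.lower().split())

    print(score("damn good"))   # -1.0, so the phrase is classified as negative
    print(score("good"))        #  1.0, positive once the "damn" is dropped

Because each word is scored in isolation, the intensifier “damn” drags the whole phrase below zero, which is exactly the kind of misread Waering is pointing to; more sophisticated systems add rules for negation and intensifiers, but sarcasm remains hard.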

Even in the most stripped-down scenario, assuming everyone is being honest and unironic in their posts, sentiment analysis still has a long way to go, the article notes.

But the real issue for a company such as Facebook Inc (NASDAQ:FB), with such a checkered history when it comes to transparency and privacy, is that altering news feed content with the aim of manipulating its users was a doomed project from the start; it didn’t take an algorithm to predict the public response.