Facebook Inc (NASDAQ:FB) stated in a research paper that it conducted an experiment on the accounts of approximately 700,000 users to test whether emotions spread online. The researchers at Facebook changed the News Feed of these users to examine “whether exposure to emotions led people to change their own posting behaviors.”
Facebook alters algorithm
The users were split into groups: one received more positive posts, while another received more emotionally negative posts from their friends. Facebook did not alter the actual posts, which could still be read on friends’ profiles; instead, it created an algorithm that edited the content the targeted users saw in their News Feed.
The social networking company made an interesting observation: those who saw positive content posted more positive material on Facebook, on average, than those who were shown more negative posts in their News Feed. Just as an individual can become upset upon seeing a friend who is sad, the same holds true on Facebook, where the researchers observed that “textual content alone appears to be a sufficient channel” for the negative effect.
Is this a breach of privacy?
According to Facebook Inc (NASDAQ:FB), this is the first experiment to show results of this kind, and it is one of the largest in scale to date. However, users may not be delighted to hear this. The company said that users grant permission to the website when they sign up. Even so, conducting an experiment without informing the account holders can be seen as an abuse of the social network’s popularity and position.
Data scientist Adam Kramer, a co-author of the report, told the BBC, “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out.” During the study, the researchers were also concerned that negative posts from friends might cause users to disengage from Facebook Inc (NASDAQ:FB).
Kramer acknowledged that Facebook did not properly convey to users its reasons for conducting the research. He said that he understands why some people take issue with the report and that he and his co-authors are very sorry for the way the paper described the research.