Users are angry about Facebook Inc (FB) emotion-manipulation study
Recently, a study was published about how Facebook Inc (NASDAQ:FB) manipulated News Feeds. A team of social scientists from Cornell and the University of California-San Francisco released a study detailing how emotions expressed in posts and status updates can “spread” to your friends. The research team at Facebook selected 689,003 users for the experiment. Some users saw a reduced amount of positive news in their News Feeds, while others saw a reduced amount of negative news.
Users shown less positive news began using more negative words in their own posts, while users shown less negative news used more positive words. The findings were described by Jeff Hancock, a professor at Cornell’s College of Agriculture and Life Sciences and co-director of the Social Media Lab at Cornell.
The research team noticed both the “contagion” effect and a withdrawal effect: people exposed to fewer emotional posts of either kind tended to post less expressive updates themselves.
“This observation, and the fact that people were more emotionally positive in response to positive emotion updates from their friends, stands in contrast to theories that suggest viewing positive posts by friends on Facebook may somehow affect us negatively,” said Hancock in a statement. “In fact, this is the result when people are exposed to less positive content, rather than more.”
Earlier experiments had shown that “emotional contagion” occurs in real-life situations, meaning that interacting with happy people tends to make you happier. The Cornell study suggests the contagion also applies when people are merely exposed to emotion, not just when they are directly interacting with someone.
Because of Facebook’s data-use policy, the researchers never saw the content of the posts; instead, they counted the number of positive and negative words in more than 3 million posts containing a total of 122 million words. According to the study, 4 million of those words were positive and 1.8 million were negative.
It is believed that Facebook conducted the experiment for one week in January 2012. The results from the study were published in the journal PNAS (Proceedings of the National Academy of Sciences).
Facebook is legally allowed to do this because users agree to give up their data for analysis when signing up for the service. Critics are not objecting to the research itself, but to the fact that users’ feeds were manipulated without their prior knowledge or consent.
Psychologists generally follow the practice of obtaining “informed consent” from study participants before conducting experiments. Facebook’s user agreement does not mention data manipulation.
“When psychologists conduct research … they obtain the informed consent of the individual or individuals using language that is reasonably understandable to that person,” says the American Psychological Association.
Facebook’s data-use policy runs to around 10,000 words, but it never explicitly states that users’ feeds can be manipulated for research.
“None of the data used was associated with a specific person’s Facebook account,” a Facebook spokesperson told Forbes. “We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do, and have a strong internal review process.”