Facebook’s Psychological Experiment Secretly Manipulated News Feeds of Users in 2012
Facebook secretly manipulated the news feeds of its users to gauge their emotional responses, in a study that has prompted outrage among users criticising the ethics of the experiment.

In early 2012, Facebook tweaked the feeds of 690,000 users for a week to show a disproportionate number of positive or negative posts. The study found that users shown more negative posts went on to post more negative content themselves, while users in the positive group responded with more upbeat content. Although the mood changes were slight, the researchers said the findings have huge implications given the size and scale of the social network.
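
To give a sense of how such an outcome might be measured, here is a minimal Python sketch of counting emotional words in a user's posts. The word lists below are illustrative stand-ins, not the LIWC dictionaries the researchers actually used, and the function is an assumption about the general approach rather than the study's code.

```python
# Sketch of the study's outcome measure: the percentage of positive and
# negative words in a user's own status updates. The word lists here are
# hypothetical placeholders for the LIWC dictionaries used in the paper.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "good"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "bad"}

def emotional_word_rates(posts):
    """Return (positive %, negative %) of words across a user's posts."""
    words = [w.strip(".,!?").lower() for post in posts for w in post.split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    return 100 * pos / len(words), 100 * neg / len(words)

# Example: a user's posts during the experiment week.
posts = ["Feeling great about the weekend!", "What an awful commute today."]
print(emotional_word_rates(posts))  # (10.0, 10.0)
```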

The results of the study, conducted by researchers from Cornell, the University of California, San Francisco, and Facebook, were published this month in the prestigious academic journal PNAS (Proceedings of the National Academy of Sciences).

Legally, Facebook's terms of service give it permission to conduct this kind of research: when users sign up for the social network, they agree to share their data for analysis, testing and research. However, it is not the study itself that concerns people; it is the decision to run it without users' explicit consent.

According to the study, Facebook uses software to ascertain which of the roughly 1,500 available posts, both positive and negative, will show up in a user's News Feed. The company frequently alters this program to modify the mix of news, personal stories and advertisements users see.
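
As a rough illustration of the kind of filtering the study describes, the Python sketch below omits posts of a targeted sentiment from a candidate pool with some probability. The classify() step and the 50% omission rate are assumptions for illustration; they are not Facebook's actual implementation, which per the paper used LIWC word counts and varied the omission rate per user.

```python
import random

def classify(post):
    """Toy sentiment classifier; a stand-in for the real word-count method."""
    negative_words = {"sad", "awful", "hate", "terrible"}
    return "negative" if any(w in post.lower() for w in negative_words) else "other"

def filter_feed(candidate_posts, target="negative", omit_prob=0.5, seed=None):
    """Drop each post matching the targeted sentiment with probability omit_prob."""
    rng = random.Random(seed)
    feed = []
    for post in candidate_posts:
        if classify(post) == target and rng.random() < omit_prob:
            continue  # post silently omitted from this user's feed
        feed.append(post)
    return feed

candidates = ["Such a terrible day.", "Look at my new puppy!", "I hate Mondays."]
print(filter_feed(candidates, seed=42))
```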

The Facebook researcher who designed the experiment, Adam D. I. Kramer, said in a post on Sunday that the research was an attempt to improve the service, not to upset users.

"I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused," Kramer wrote. "In hindsight, the research benefits of the paper may not have justified all of this anxiety."

Facebook, which has more than one billion active users, said in a statement that the data used in the study was not associated with any specific person's Facebook account. The company added that it frequently conducts research to improve its services and that all data collected in these research initiatives is stored securely.

It appears that Facebook will not face any legal repercussions, thanks to the company's terms of service, but manipulating users' feeds without their prior consent or knowledge does raise serious ethical questions.

 

Author
Anna Domanska
