Facebook Manipulated News Feeds in Psychological Experiment
Facebook recently released a study of user behavior and, in doing so, angered many of its billion-plus worldwide users. The company admitted to manipulating users’ news feeds to see whether it had any effect on the users’ own posts, making Facebook members into (somewhat) unwitting test subjects.
The 2012 study examined 700,000 users. The algorithm determining a user’s news feed was manipulated for some of the group, eliminating friends’ posts with negative words; for others, posts with positive words were eliminated; and for a control group, random posts were eliminated. Facebook data scientist Adam Kramer and his team were examining how friends’ posts affect the user experience.
The results possibly disproved the idea of “Facebook envy,” whereby users feel bad about themselves when they see lots of good news from friends. Instead, the researchers found that subjects with more positive feeds posted more positively afterward, and those with more negative feeds posted more negatively.
But an uproar ensued over the fact that Facebook had used its members as test subjects, not by passively studying their usage habits but by actively manipulating them. Though Facebook’s Terms of Service includes notice that user data may be used for research, there is much debate about what can reasonably be included in such rarely read documents.
Facebook claims that after it conducted the research, it had the methods approved by Cornell University’s ethics review board for human testing. A Princeton psychologist edited the study for publication, and it appeared in an academic journal.
But all that might not mean much to Facebook users who feel betrayed. “In hindsight, the research benefits of the paper may not have justified all of this anxiety,” wrote Kramer in response to the backlash.