Facebook defends super-creepy psychological newsfeed study

Over the weekend it emerged that Facebook carried out a study in 2012 on the impact of its news feed algorithms on the emotions of users… by deliberately manipulating their news feeds. Now the engineer behind it has defended the action.

The “Emotional Contagion” analysis, which was written up as a scientific paper and published in the Proceedings of the National Academy of Sciences, worked by having the news feed select fewer “emotional” posts to display to a smallish number of users for a week in January 2012, to see if it affected their behaviour.

The result? Apparently there is indeed a ‘contagion’, with negative posts from your friends leading to more negative posts from you – and positive posts leading to happier posts from you.

Whilst it sounds like an interesting study in theory, the internet reacted with fury over the weekend as the news broke: after all, it was done in secret… whatever happened to informed consent? More broadly, it raises questions about the secretive role services like Facebook play in shaping our conception of what is going on in the world.

Adam Kramer was one of the Facebook engineers behind the study and, following the shitstorm, has posted something of a defence on his (where else?) Facebook wall. Here are some choice highlights, with some sarcastic comments from me after each:

“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.”

Who needs individual agency when you’ve got the wonder-drug that is “Facebook”?

“Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). Nobody’s posts were “hidden,” they just didn’t show up on some loads of Feed. Those posts were always visible on friends’ timelines, and could have shown up on subsequent News Feed loads. And we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it.”

See, this is why doctors should be able to test new drugs on patients without their consent. “Hey, it’s only a very tiny proportion of people getting the mystery drug”, they say – so is it really that horrifying?

“And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it — the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.
The goal of all of our research at Facebook is to learn how to provide a better service.”

If Facebook were a romantic comedy, perhaps people would think this sort of manipulation is fine?

“Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”

This is the most bizarre section. Kramer says “our goal was never to upset anyone”, yet by increasing and decreasing the amount of emotive content in people’s feeds as part of a test… isn’t upsetting people (or making them happier) exactly what Facebook was trying to see if it could do?

Given the backlash, it’ll be interesting to see if Facebook try an experiment like this again. If only the company had some means of conditioning us so we’re more accepting next time…

James O’Malley