Facebook Manipulated News Feeds To See If Online Emotions Can Spread

This is the only appropriate response to Facebook’s social engineering. (courtesy Sean McEntee’s Flickr feed)

Facebook has spent the better part of a year trying to prove that its users’ privacy really is protected. Well, that task just became several times harder with news that a team of scientists has been manipulating our news feeds to see if it’s possible for online emotions to spread from one person to another.

It’s well known that when you spend a lot of time around a person who’s sad all the time, you’ll end up feeling down yourself. Facebook wanted to know if this could happen online as well, so it had one of its scientists, Adam Kramer, team up with UCSF’s Jamie Guillory and Cornell’s Jeffrey Hancock to find out. The results of their study were published in the National Academy of Sciences’ official journal, PNAS. They randomly selected 689,003 Facebook users and spent a week in January 2012 manipulating their news feeds. Some users saw feeds skewed toward posts with more positive words, while others saw more posts with negative words.

The study found that when our friends express their feelings on Facebook, it does indeed have a big influence on our feelings, “constituting experimental evidence for massive-scale contagion via social networks.” Whenever news feeds were manipulated to increase positive emotions, people produced more positive posts. The same happened with increased negative posts. In other words, Facebook has the same ability to make you feel good or bad as people do in the real world.

Although it would seem that such an experiment raises astronomical ethical problems, the study was considered legitimate due to Facebook’s Data Use Policy. Specifically, a clause in the “Information we receive and how it is used” section states that by creating a Facebook profile, you agree that information Facebook gets from you can be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.” In other words, creating an account on Facebook served as informed consent for this research. While the research may be technically legal and legitimate, the fact that it relied on a browse-wrap agreement should leave a bad taste in anyone’s mouth.

According to BuzzFeed tech editor Charlie Warzel, this isn’t the first time that Facebook has turned its users into guinea pigs. Warzel happened upon a 2012 article from MIT Technology Review that revealed no less a personality than Mark Zuckerberg used Facebook to drive up organ donor enrollment. He added a box to users’ timelines that allowed them to say whether or not they were organ donors. The tsunami of social pressure resulted in donor enrollments going up almost 23-fold in 44 states.

I’ve had my personal Facebook profile set to “friends only” for a long time. This just adds to the myriad of reasons why. Let us know if you feel the same way at the Liberal America Facebook page!


Darrell Lucus is a radical-lefty Jesus-lover who has been blogging for change for a decade. Follow him on Twitter @DarrellLucus or connect with him on Facebook.

Darrell is a 30-something graduate of the University of North Carolina who considers himself a journalist of the old school. An attempt to turn him into a member of the religious right in college only succeeded in turning him into the religious right’s worst nightmare--a charismatic Christian who is an unapologetic liberal. His desire to stand up for those who have been scared into silence only increased when he survived an abusive three-year marriage. You may know him on Daily Kos as Christian Dem in NC. Click here to buy Darrell a Mello Yello.