When I first heard that Facebook was messing with our emotions by manipulating the stories some users saw in their News Feed, my first thought was: Duh.

News Feed plays with my heart and mind every day; that's kind of why I still check it. I want to see the myriad creative ways people have chosen to announce their engagements and pregnancies. I want to see photos of my far-away family. I could do without the people who constantly complain about their lives or the ones who seek attention with their romantic escapades, but that's all part of the messiness of Facebook.

But the network's week-long psychological experiment feels more sinister. Facebook was trying to figure out if people can transfer emotions to one another: for instance, if you surround yourself with people who are sad or depressed, that could lead to your own sadness. So the network manipulated the News Feeds of nearly 700,000 users, reducing negative emotional posts for one group and reducing positive emotional posts for the second. The experiment spanned a week's worth of News Feed posts, from Jan. 11-18, 2012.

The results were published in an academic journal this week, immediately incensing the Twittersphere and the blogosphere, but not so much the Facebook-sphere. (Not one peep about the experiment in my own News Feed, at least.)

Facebook's future experiments

I couldn't quite muster the outrage others were feeling. Facebook is a corporation to which I willingly hand over my information. Though I don't explicitly consent to all of the changes they make, they're free to do whatever they want, and I'm free to leave. That doesn't mean Mark Zuckerberg shouldn't feel a guilty twinge over some of the decisions he's made in the name of improving Facebook, and this experiment is just the latest example. In some respects, like privacy, Facebook has figured out that enraging its users is just bad PR. But it took many, many, many missteps to get to that point. (Let's take a moment to remember Beacon, Sponsored Stories, and the original News Feed, which many thought was beyond creepy.)

Going forward, the network should make psychological experiments opt-in and open, not a secret kept from the people being studied. Facebook should also weigh carefully whether there are any real benefits to testing its users this way.

Facebook data scientist Adam Kramer defended the research in a Sunday Facebook note. "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling left out," he wrote. "At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."

The research found the opposite: positivity begets positivity, and negativity begets negativity. But the effects were so small and the methodology so questionable (it turns out it's pretty tough to parse someone's emotional state from a Facebook post) that the experiment just doesn't seem worth it.

"In hindsight, the research benefits of the paper may not have justified all of this anxiety," Kramer added.

Facebook needs to remember that its users are human beings, not data points. And if you don't like being used as a guinea pig, you can always leave the network altogether.