The furor surrounding Facebook’s decision to conduct an experiment that secretly manipulated the News Feed of some users to study emotion contagion reached a peak this weekend, with many calling the act creepy at best, and downright unethical at worst.

Although the editor of the study recently admitted to being “a little creeped out” by the way in which the study was conducted, Facebook itself had not offered any detailed comment on the matter — until now.

In a public post on Facebook, one of the co-authors of the study, Adam D. I. Kramer, a member of Facebook’s Core Data Science Team, finally responded to the study’s critics.

“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,” wrote Kramer. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.

“We didn’t clearly state our motivations in the paper.”

But while Kramer’s statement about the company’s reasoning behind the study will be a welcome clarification for some, there’s still the matter of greatest import to the study’s critics: involving users in a psychological experiment without their consent (lengthy and sometimes vague Terms of Service agreements aside).

After summarizing the study’s methodology and emphasizing that “Nobody’s posts were ‘hidden,’ they just didn’t show up on some loads of Feed,” Kramer wades, in indirect fashion, into the delicate territory of how Facebook views the matter of user experiments on the site.

“[A]t the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it…” That’s about as close as Kramer comes to directly acknowledging that Facebook covertly manipulated its users for an experiment.

Later in the statement, he does offer a bit of contrition, writing, “[O]ur goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused.”

However, given the vague wording of the statement, it’s unclear exactly what Kramer “understands” with regard to user concerns and anxieties. Similarly, rather than directly address the widely voiced objection that the study involved users in an experiment without their knowledge, Kramer instead apologizes for the way the paper “described” the experiment.

Reactions to the statement were swift, with commenters on the post almost evenly divided between those supporting Facebook and those who remain troubled by the company’s actions.

Source: http://mashable.com/2014/06/29/facebook-responds-to-negative-reactions-to-its-emotion-contagion-study/