In this June 11, 2014 photo, a man poses for photographs in front of the Facebook sign on the Facebook campus in Menlo Park, Calif. In the coming weeks, Facebook will start offering advertisers another way to tailor ads in the U.S., based on information gathered from other websites you visit and the apps you use. (AP Photo/Jeff Chiu)
By Brandon Bailey

MENLO PARK — Facebook is facing a firestorm of outrage this week over an experiment in which researchers temporarily tweaked the contents of nearly 700,000 users’ news feeds — without their knowledge — to test their emotional response to seeing more positive or negative news from friends.

As word of the one-week experiment spread online, some users, legal experts and even medical researchers accused Facebook of treating the test subjects like lab rats by deliberately manipulating their emotions in ways that could potentially cause harm.

Facebook downplayed the study Monday in a statement that characterized it as just one of many tests the company conducts to make the social network “more relevant and engaging.” Defenders pointed out that Internet companies like Facebook, Google and Yahoo are constantly testing users’ reactions to different types of content, including advertising, in ways that determine what each user sees in the future.

But this particular experiment struck a nerve with many.

“People suffering from severe depression or on the verge of suicide could have been very adversely affected,” complained San Francisco artist Susan Lien Whigham in a Facebook post over the weekend. She added: “Shame on you Facebook. Whether or not it’s legally permissible, doing social experiments on people without their permission is ETHICALLY WRONG.”

Other critics raised questions about the role that researchers from Cornell and UC San Francisco played in the project, since academic and government researchers are required to get informed consent for research in which human subjects could suffer harm, and to submit proposed studies for ethical review.

Facebook researchers said the research was allowed under the company’s data use policy, although critics said most users probably never read or noticed a vague reference to “research” in that 9,000-word document.

By Sunday, however, Facebook data scientist Adam Kramer had posted an apology of sorts.

Without conceding specific errors or lapses in the project, Kramer wrote on his own Facebook page: “I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the (academic) paper described the research and any anxiety it caused.”

While the experiment was conducted more than two years ago, the researchers described their findings this month in the Proceedings of the National Academy of Sciences. Kramer said they wanted to test a widely held belief that some Facebook users get depressed when they see frequent updates from friends who seem to be having more fun than they are. The test results, he said, debunk that notion.

During the weeklong experiment, Facebook adjusted the news feeds of users in two test groups by withholding a share of "positive" or "negative" friend updates from each group. Using software that scanned posts for words indicating happiness or sadness, the researchers found that people who saw fewer negative updates were more upbeat in their own posts, while those who saw fewer positive updates posted more negatively themselves.
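As a rough illustration of the word-matching approach the researchers describe, the sketch below labels a post as positive, negative or neutral by checking it against small word lists. The lists, names and thresholds here are purely illustrative; they are not the actual tooling or dictionaries used in the study.

```python
# Illustrative word-list sentiment tagging, in the spirit of the method
# described above. The word lists below are made up for this example and
# are not the researchers' actual lists.

POSITIVE_WORDS = {"happy", "great", "love", "fun", "awesome"}
NEGATIVE_WORDS = {"sad", "angry", "hate", "awful", "lonely"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by word matching."""
    words = set(text.lower().split())
    has_positive = bool(words & POSITIVE_WORDS)
    has_negative = bool(words & NEGATIVE_WORDS)
    if has_positive and not has_negative:
        return "positive"
    if has_negative and not has_positive:
        return "negative"
    return "neutral"

# Example: classify_post("Had an awesome day at the beach") -> "positive"
```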

Researchers also reported a “withdrawal effect,” in which users who saw fewer emotional posts, whether positive or negative, tended to be “less expressive overall on the following days.”

The study builds on a fact that’s not universally known: Facebook’s primary news feed doesn’t show every item posted by a user’s friends. Instead, Facebook algorithms select only a portion of the available updates, based on such factors as how often the user has liked or commented on similar posts or her previous interactions with the person who posted an update. In effect, the researchers in this study simply changed the algorithm for certain users.
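To make that concrete, here is a minimal sketch of the kind of filtering the article describes: candidate posts are ranked by simple engagement signals, and for users in a test group a fraction of posts with one sentiment is dropped before the feed is shown. Every function name, field and weight below is hypothetical; this is not Facebook's actual algorithm.

```python
# Hypothetical sketch of feed selection with an experimental filter applied.
import random

def rank_posts(posts, user):
    """Order candidate posts by a toy affinity score (prior likes/comments)."""
    def score(post):
        return (user.get("affinity", {}).get(post["author"], 0)
                + post.get("like_count", 0) * 0.1)
    return sorted(posts, key=score, reverse=True)

def build_feed(posts, user, omit_sentiment=None, omit_fraction=0.0):
    """Return a ranked feed, optionally omitting a share of one sentiment."""
    ranked = rank_posts(posts, user)
    if omit_sentiment is None:
        return ranked
    return [p for p in ranked
            if p["sentiment"] != omit_sentiment
            or random.random() > omit_fraction]
```

In this toy version, control-group users get the unmodified ranking, while test-group users get the same ranking with, say, a portion of "negative" posts withheld.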

While Kramer insisted over the weekend that the “actual impact” on users was minimal, critics objected that the test effectively tinkered with the subjects’ emotional well-being.

In a blog post, University of Maryland law professor James Grimmelmann said the study’s unwitting participants “were told (seemingly by their friends) for a week either that the world was a dark and cheerless place or that it was a saccharine paradise. That’s psychological manipulation.”

Others said the concerns are overblown. “Facebook simply removed a variable proportion of status messages,” Tal Yarkoni, a psychology researcher at the University of Texas, Austin, wrote on his blog. “I hope that people who are concerned about Facebook ‘manipulating’ user experience in support of research realize that Facebook is constantly manipulating its users’ experience.”

The debate highlights “a big gap in what consumers really understand about platforms like Facebook,” said Irina Raicu, director of the Internet ethics program at Santa Clara University’s Markkula Center, who believes most users don’t give much thought to why they see particular items on the site.

Raicu said the study also raises concerns that private research may not meet the same ethical rules imposed on government and academic scientists. While Cornell had issued a news release touting the study, the university said in a statement Monday that faculty members “did not participate in data collection” and only helped analyze information Facebook gathered.

Facebook, meanwhile, stressed that users were never identified in the study. Kramer also wrote that the company is updating its research standards, including “what we’ve learned from the reaction to this paper.”

Contact Brandon Bailey at 408-920-5022 or follow him at Twitter.com/BrandonBailey.

Facebook’s data policy

Researchers said their controversial experiment on emotional reactions was done with “informed consent” because all Facebook users agree to the company’s 9,000-word Data Use Policy, which includes the following statement:
“For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you … for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”