Did Facebook's experiment violate ethics?

By Robert Klitzman
updated 8:51 AM EDT, Wed July 2, 2014
  • Facebook conducted a study on nearly 700,000 users by manipulating their news feeds
  • Robert Klitzman: Facebook basically tried to alter people's mood without their knowledge
  • He says despite Facebook's user policy, this study violates accepted research ethics
  • Klitzman: We should try to avoid as much as possible becoming human guinea pigs

Editor's note: Robert Klitzman is a professor of psychiatry and director of the Masters of Bioethics Program at Columbia University. He is author of the forthcoming book, "The Ethics Police?: The Struggle to Make Human Research Safe." The opinions expressed in this commentary are solely those of the author.

(CNN) -- Like many people, I use Facebook to keep up with friends about all kinds of things -- deaths, births, the latest fads, jokes.

So I was disturbed to learn about an article, "Experimental evidence of massive-scale emotional contagion through social networks," published last week in the Proceedings of the National Academy of Sciences (PNAS).

Facebook had subjected nearly 700,000 users to an experiment without their knowledge, manipulating these individuals' news feeds by reducing positive or negative content and then examining the emotional tone of their subsequent posts.

Facebook essentially sought to manipulate people's mood. This is not a trivial undertaking. What if a depressed person became more depressed? Facebook says that the effect wasn't large, but it was large enough for the authors to publish the study in a major science journal.

This experiment is scandalous and violates accepted research ethics.

In 1974, following revelations of ethical violations in the Tuskegee Syphilis study, Congress passed the National Research Act. At Tuskegee, researchers followed African-American men with syphilis for decades and did not tell the subjects when penicillin became available as an effective treatment. The researchers feared that the subjects, if informed, would take the drug and be cured, ending the experiment.

Public outcry led to federal regulations governing research on humans, requiring informed consent. These rules pertain, by law, only to studies conducted using federal funds, but they have been extended by essentially all universities and pharmaceutical and biotech companies in this country to cover all research on humans, becoming the universally accepted standard.

According to these regulations, all research must respect the rights of individual research subjects, and scientific investigators must therefore explain to participants the purposes of the study, describe the procedures (and which of these are experimental) and "any reasonably foreseeable risks or discomforts."

Facebook followed none of these mandates. The company has argued that the study was permissible because the website's data use policy states, "we may use the information we receive about you...for internal operations, including troubleshooting, data analysis, testing, research and service improvement," and that "we may make friend suggestions, pick stories for your News Feed or suggest people to tag in photos."

While the company itself may not be legally bound by these regulations, two of the study's three authors are affiliated with universities -- Cornell and the University of California, San Francisco -- that publicly uphold this standard.

The National Research Act led to the establishment of local research ethics committees, known as Institutional Review Boards (or IRBs), which can waive the informed consent requirement in certain instances, provided, "whenever appropriate, the subjects will be provided with additional pertinent information after participation" -- that is, researchers should "debrief" the participants afterwards.

Such a debriefing apparently did not occur here, but easily could have. Facebook said it reviewed the research internally, but there is no evidence that that review was by an IRB or met the standards of the federal regulations.

Moreover, the journal, PNAS, mandates that "all experiments have been conducted according to the principles expressed in the Declaration of Helsinki," which also dictates that subjects be informed of the study's "aims, methods...and the discomfort it may entail."

The lead author, Adam Kramer, apologized on Facebook, writing, "my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused." But that statement falls far short. The problem is not only how the study was described, but how it was conducted.

Many researchers try to avoid having to obtain appropriate informed consent, worrying that potential subjects, if asked, would refuse to participate. Pharmaceutical, insurance and Internet companies, among others, are increasingly studying us, acquiring massive amounts of data -- not only about our Internet use, but about our genomes and medical records. Many medical centers are building enormous biobanks. Countless websites now examine our behavior online. They ask us to scroll down and click "I accept," knowing we're unlikely to read the dense legalese before accepting their terms.

In July 2011, President Obama released proposals to improve the current system of oversight on human research. The federal Office of Human Research Protections received public comments for a few months but appears to have put this on the back burner.

Social scientists have complained that the current regulations are onerous and that their research should be excused from IRB review.

The current system is overly bureaucratic and needs reform. But as this controversial Facebook experiment suggests, it should not be scrapped.

Good experiments benefit society. But in their zeal to conduct research, some social scientists overlook how their studies may impinge on people's rights. As the amount of research on humans continues to grow, more violations will probably occur. We should try to avoid as much as possible becoming human guinea pigs.
