When is Facebook NOT messing with your head?

By David Weinberger
July 1, 2014 -- Updated 1243 GMT (2043 HKT)
STORY HIGHLIGHTS
  • David Weinberger: Many angry over Facebook's psychological experiment on users in 2012
  • He says it was only a more intrusive version of what FB and other websites do all the time
  • FB decides what we'll know about friends based on what profits FB, he says
  • Weinberger: Facebook could use this control to better society, but the bottom line comes first

Editor's note: David Weinberger is a senior researcher at Harvard University's Berkman Center for Internet & Society and author of "Too Big to Know" (Basic Books). The opinions expressed in this commentary are solely those of the author.

(CNN) -- Many people are outraged about the just-revealed psychological experiment Facebook performed in 2012 on 690,000 unwitting people, altering the mix of positive and negative posts in their feeds. Playing with people's emotions without their consent is a problem. But it would be even worse if we think -- after Facebook posts one of its all-too-common apologies -- that Facebook is done manipulating its users.

No. The experiment was only a more intrusive version of what the company does every time we visit our Facebook page.

Facebook's experiment was a version of so-called "A/B" testing, one of the most widely used and effective techniques large websites use to "provide a better customer experience" -- that is, to sell us more stuff.

For example, for years Amazon has routinely experimented with seemingly insignificant changes to its pages, like showing half of its visitors a discount offer on the left side of the page and showing the other half the same offer on the right. If Amazon finds a statistically significant uptick in clicks when the offer sits on one side, from then on that's where it puts its offers. Companies A/B test every parameter of a page, from font sizes to colors to the depth of the drop shadows.
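To see the statistics behind such a test, here is a minimal sketch in Python of the standard two-proportion z-test a site might run on an experiment like that. The function name and the traffic numbers are hypothetical, made up for illustration; this is not Amazon's actual code.

```python
import math

def ab_significant(clicks_a, views_a, clicks_b, views_b, z_crit=1.96):
    """Two-proportion z-test: does variant B's click rate differ from
    variant A's at the 95% confidence level?"""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled click rate under the null hypothesis of no real difference.
    p = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p * (1 - p) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    return abs(z) > z_crit, z

# Hypothetical traffic: the offer shown on the left vs. on the right.
significant, z = ab_significant(clicks_a=480, views_a=50_000,
                                clicks_b=560, views_b=50_000)
print(f"z = {z:.2f}, significant at 95%: {significant}")
```

If one variant clears the significance threshold, its placement wins and quietly becomes the default for everyone.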

But the Facebook experiment was not normal A/B testing. Usually a test alters some seemingly irrelevant factor. But Facebook's experiment changed the site's core service: showing us what's up with our friends. Worse, Facebook did so in a way likely to affect the emotional state of its users. And that's something to be concerned about.

But much of the outrage is driven by a false assumption: that there is a "real" mix of news about our friends.

There isn't. Facebook always uses algorithms to figure out what to show us and what to leave obscure. Facebook is in the business of providing us with a feed that filters the Colorado River rapids into a tinkling stream we can drink from.
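Nobody outside the company knows Facebook's actual ranking code, but the general shape of such a filter is easy to sketch. Below is a deliberately simplified, hypothetical scorer in Python; every field, weight, and function name is an assumption made for illustration. Notice that a single sentiment weight is enough to tilt an entire feed positive or negative, which is essentially the knob the 2012 experiment turned.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_closeness: float  # how often the viewer interacts with this author
    predicted_clicks: float  # a model's guess at click probability
    sentiment: float         # -1.0 (negative) to 1.0 (positive)

def feed_score(post: Post, sentiment_weight: float = 0.0) -> float:
    """Hypothetical engagement-first scoring. The weights are invented;
    the point is that one sentiment_weight knob can tilt the whole feed."""
    return (0.6 * post.predicted_clicks
            + 0.4 * post.author_closeness
            + sentiment_weight * post.sentiment)

def build_feed(posts: list[Post], limit: int = 10,
               sentiment_weight: float = 0.0) -> list[Post]:
    """Show only the top-scoring posts; everything else stays obscure."""
    ranked = sorted(posts, key=lambda p: feed_score(p, sentiment_weight),
                    reverse=True)
    return ranked[:limit]

posts = [Post(0.9, 0.2, -0.8), Post(0.3, 0.7, 0.5), Post(0.5, 0.4, 0.1)]
neutral = build_feed(posts, limit=2)                          # engagement only
gloomier = build_feed(posts, limit=2, sentiment_weight=-0.5)  # more negativity
```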

The 2012 experiment is a window onto this larger concern: Facebook, an important and even dominant part of our social infrastructure, makes decisions about what we'll know about our friends based on what works for Facebook, Inc., and only secondarily based on what works for us as individuals and as a society.

This point is illustrated in Eli Pariser's excellent book (and terrific TED Talk) "The Filter Bubble." Facebook filters our feeds to make us happier customers. But Facebook defines a happy customer as one who comes back often and clicks on a lot of links.

When it comes to politics, we can easily see the problem: Showing us news that excites our click finger is a formula for promoting shouting and political divisiveness. Too much of that is bad. But in politics, and in social relationships more broadly, do we know what the "right mix" is?

Are we sure that filtering social news to include more of the negative is bad? Positive filtering, after all, can paint a too-rosy picture of our social network, shielding us from the full force of life as it is actually lived. I don't know the answer, but it can't come from a commercial entity whose overriding aim is to keep us coming back so we can buy more from its advertisers.

There are many options to play with here. For example, we could be given more individual control over our own filters. Or a site could "nudge" us toward feeds that achieve socially desirable aims like making us more willing to explore and embrace differences.

But we're unlikely to see such options so long as we have given control over the flow of our social information to commercial entities whose primary interest is not the health of our society and culture but their bottom line. Sometimes those interests may align, but not reliably or often enough.

So, I'm upset about Facebook's cavalier toying with our emotions, but I'm far more disturbed about what Facebook and other such sites do all the time.
