It does not take a whistleblower to know that Facebook’s vast platform is used to spread hate and misinformation.
But now we do have a whistleblower: former Facebook product manager Frances Haugen. She alleges that the scale of the problem is far worse than the company lets on or the public understands, that Facebook hides evidence that the platform is being used this way, and that its algorithm turbocharged US divisions by feeding users hateful and false content.
The list goes on.
Her interviews with the Wall Street Journal, followed by her appearance on “60 Minutes” this past Sunday night and now in front of a congressional committee on Tuesday, will further focus attention on the idea that the most democratizing invention of recent generations – a free website that allows strangers as well as friends to find each other, to organize and to speak their minds – now poses a threat to democracy by not just allowing but actively promoting false narratives, conspiracy theories and hate.
Haugen’s trove of documents, which includes internal Facebook research she copied before leaving her job as a product manager this year, suggests that Facebook:
- Knows it has caught only a small fraction of the misinformation spread on the site.
- Abandoned some efforts to cut down on misinformation after the presidential election.
- Knows the effect Instagram has on some young minds.
Previously, the Wall Street Journal reported on how drug pushers and human traffickers use the platform, and how Instagram creates a feedback loop that hurts the mental health of US kids, particularly girls. You might recall an earlier whistleblower who alleged Facebook was complicit in the sale of opioids on its platform.
Key takeaway from Haugen: When misinformation is the main weapon in an ongoing war of ideas, the massive company has chosen the profits it reaps from capturing eyeballs and engaging users over its moral responsibility to cut down on the toxic stuff spread on its platform.
Haugen gave the example that if a new user signed up and followed Donald Trump, it wouldn’t be long before the platform’s algorithm was feeding them QAnon conspiracy theories.
She told the Wall Street Journal that she does not want people to dislike or stop using the platform, but that it needs to be fixed.
“If people just hate Facebook more because of what I’ve done, then I’ve failed,” she told the Journal. “I believe in truth and reconciliation – we need to admit reality. The first step of that is documentation.”
Everyone – even Facebook – seems to agree there should be some regulation of social media companies that requires more responsibility for what’s on their platforms. But Facebook very much wants to steer that regulation.
What does Facebook say? The company called Haugen’s allegations misleading and said the platform does more good than harm. In a prebuttal to Haugen’s “60 Minutes” interview, Facebook VP and former British liberal politician Nick Clegg talked at length with CNN’s Brian Stelter.
Here are some quotes from this excellent interview, which includes Clegg agreeing that there should be some regulation.
Is Instagram toxic to teenage girls? Not to all teenage girls, Clegg argued.
STELTER: For teenage girls, is the world better with Instagram in it or is it worse?
CLEGG: Well, the vast majority of teen girls and, indeed, boys who have been covered by some of the surveys that you referred to say that for the overwhelming majority of them, it either makes them feel better or it doesn’t make a difference one way or the other.
If you’re skeptical that teen girls are exposed to inappropriate content that could foster eating disorders, read this CNN Business report published Monday, where Instagram acknowledges promoting pages glorifying eating disorders to teen accounts. This is dark stuff.
Why doesn’t Facebook release the kind of research that Haugen leaked? Clegg argued that Facebook has more than 1,000 Ph.D.s on staff and does a lot of research and not all of it is meant to be public. This research, he argued, was meant to help Facebook improve its platforms.
CLEGG: So we do a huge amount of research. We share it with external researchers as much as we can. But do remember, there is a – and I’m not a researcher, but researchers will tell you that there’s a world of difference between doing a peer-reviewed exercise, in cooperation with other academics, and preparing papers internally to provoke an informed internal discussion.
Is Facebook like the tobacco companies? Drawing the much-repeated comparison between Facebook trying to hook users and tobacco companies trying to hook smokers, Stelter said he enjoys Instagram but he does feel the pull of an addiction to it.
CLEGG: Let me give you one very simple reason why this is such a misleading analogy. The people who pay our lunch are advertisers. Advertisers don’t want their content next to hateful, extreme or unpleasant content.
Is Facebook a monster that’s too big to control? Possibly.
CLEGG: Even with the most sophisticated technology, which I believe we deploy, even with the tens of thousands of people that we employ to try and maintain safety and integrity on our platform, you’re right, Brian. We’re never going to be absolutely on top of this 100% of the time, because this is an instantaneous and spontaneous form of communication, where billions of human beings can express themselves as they want, when they want, to each other.
Is Facebook responsible for the divisions that led to the January 6 insurrection? No, Clegg argued.
CLEGG: I think it gives people false comfort to assume that there must be a technological or a technical explanation for the issues of political polarization in the United States.
STELTER: You think it’s too easy – it’s too easy to say it’s Facebook’s fault?
CLEGG: Well – well, I think it would be too easy, surely, to suggest that with a tweak to an algorithm, somehow all the disfiguring polarization in US politics would suddenly evaporate. I think it absolves people of asking themselves the harder questions about the historical, cultural, social and economic reasons that have led to the politics that we have in the US today.
An odd coincidence. Making an already bad day worse, Facebook suffered an outage that robbed many, many people of their midday Instagram fix. From CNN Business: “I don’t know if I’ve seen an outage like this before from a major internet firm,” said Doug Madory, director of internet analysis at the network monitoring firm Kentik. For a lot of people, Madory told CNN, “Facebook is the internet to them.”
CORRECTION: An earlier version of this article misstated part of Facebook’s response to whistleblower Frances Haugen’s allegations. The company said the platform does more good than harm.