A version of this story appeared in CNN’s What Matters newsletter. To get it in your inbox, sign up for free here.
New documents – the Facebook Papers – are bringing more clarity to the social media giant’s problems and reinforcing a whistleblower’s claim that the platform profited off the spread of false information and relies on an algorithm that pushes fake news like drugs.
What’s happening now? A trove of documents obtained by CNN and 16 other news organizations sheds new light on how Facebook allowed misinformation to spread before the 2020 election, how the platform may have helped spark the January 6 insurrection, and how the company’s profit motive is directly tied to keeping people engaged on its site, even when the content is dangerous.
Are these documents new? Yes and no. They are the basis for Facebook whistleblower Frances Haugen’s complaints to the US Securities and Exchange Commission about the company. Her attorney provided the documents, which are redacted, to the SEC and Congress as part of those complaints. The redacted versions were obtained by the consortium of news organizations.
Why are we talking about this now? Previously, the Wall Street Journal wrote a series of reports on Haugen’s complaints and these documents, many of which are internal reports by Facebook about the harm it causes. Haugen, a former product manager at the company, described the documents during her media debut in early October as an activist for public oversight of tech companies.
What’s happening now is that journalists have access to the redacted documents. Haugen testified before the UK Parliament on Monday, the same day the Facebook Papers were released.
What have we learned from these documents? A lot. These are some of the headlines from the Facebook Papers that CNN published in the past few days:
- The January 6 insurrection: Facebook Papers paint damning picture of company’s role in insurrection
- Violence in Africa: Facebook knew it was being used to incite violence in Ethiopia. Little was done to stop it.
- Human trafficking: Facebook employees flagged people for sale on its platforms in 2018. It’s still a problem.
- Oversight: Facebook may have misled its own oversight board
- Double standard: Whistleblower says Facebook’s content safety systems don’t apply similarly to non-English-speaking countries
Watch a video report from CNN’s Donie O’Sullivan on how Facebook’s own report found a test account the company made in 2019 for a fictional woman in North Carolina who followed former President Donald Trump’s account (which has since been suspended), among other popular conservative pages – and within days was being fed QAnon conspiracy theories.
What does Facebook say? CEO Mark Zuckerberg kicked off Facebook’s quarterly earnings call by addressing the latest wave of coverage on Monday.
“Good-faith criticism helps us get better, but my view is that we are seeing a coordinated effort to selectively use leaked documents to paint a false picture of our company,” he said. “The reality is that we have an open culture that encourages discussion and research on our work so we can make progress on many complex issues that are not specific just to us.”
Is this scandal having an effect on Facebook? Yes. For starters, investors are already wary of the social media company.
CNN’s Paul R. La Monica writes that Facebook has lagged the other major internet stocks, like Apple, Amazon, Netflix and Google. It’s also lagged the NASDAQ. But Facebook’s not exactly losing money: Its stock is up nearly 75% since October 2019. That’s just not as much growth as the other tech giants. Facebook reported nearly $9.2 billion in profits for the most recent quarter.
La Monica adds there’s no indication that advertisers will suddenly pull ads off Facebook, and most stock analysts predict the stock will go higher, not lower. He writes, “… unless Facebook customers and users show they truly have had enough — in a manner that impacts ad revenue, earnings and the stock price in a much more meaningful way — then there may be little incentive for Facebook to change its stripes.”
Will Facebook regulate itself? Facebook appointed its own independent Oversight Board, a sort of Supreme Court for content decisions. They’re the ones who recommended not letting Trump return to the platform, for instance.
But there are frustrations on the board. One member, Suzanne Nossel, whose day job is running the free expression organization PEN America, told CNN’s Brian Stelter on Sunday that these documents show the need for more transparency in how Facebook pushes content.
The argument for government intervention. CNN’s Allison Morrow wrote earlier this month that if shame were going to make Facebook change, it would have done so after the 2016 presidential election or any number of other moments. So it will be up to Congress to regulate Facebook and other internet companies. The European Union has already enacted some privacy rules that affect US users.
Facebook has actually asked Congress to rewrite US policies governing big tech companies since the rules date back a quarter century, before everyone’s day-to-day life was changed by the internet.
What are some options to regulate Facebook? Haugen says there should be a government-backed regulatory agency overseeing companies like Facebook. Other industries, from banking to television, have this. The Federal Communications Commission could also be given new power.
One major issue is that Facebook and other tech companies are protected from lawsuits relating to the content on their sites, including misinformation, by Section 230 of the Communications Decency Act, a decades-old law that has drawn scrutiny from both sides of the political aisle. But Congress has not been able to come together to change the law.
The UK has a draft proposal for an online safety bill that would give new power to its communication regulatory agency, Ofcom, but that has been years in the making and is not yet on the path to becoming law.
I have not seen an in-depth proposal with much support in the US. For as much as this is a problem many people agree needs to be addressed, it feels very much like we’re at the beginning of the conversation.
Here’s a New York Times report that compares two options: creating a new federal agency for oversight of platforms like Facebook vs. passing new privacy laws. The idea is that limiting what data Facebook could collect would then limit how its algorithm pushes bad information.
During the 2020 Democratic presidential primary, Elizabeth Warren had a detailed plan to regulate not just Facebook but also Amazon, Google and others. Some have suggested requiring Facebook to break up, perhaps spinning off WhatsApp or Instagram.
The danger of the government regulating speech. Freedom of speech is enshrined in the First Amendment of the US Bill of Rights, and protecting it, even when speech is distasteful or wrong, is sacrosanct in this country, where people can, and should be able to, say what they want.
Nossel cautioned that the openness of the platform is an important thing to maintain:
“I don’t want Facebook to just wipe content out without any explanation. I want people to have a recourse if they believe their ability to express themselves has been unjustifiably impaired.”
For many, particularly conservative pundits and politicians, the gripe is not the spread of misinformation but rather that Facebook’s moderators unduly target conservative voices and accounts.
Outright lies that go viral also hurt society. There’s a difference between simply saying something that’s wrong and having it amplified across the country and the world. A very large portion of the GOP doesn’t think Joe Biden won the presidential election, for instance. That’s a danger to democracy.