Internal Facebook documents revealed

By Clare Duffy, Aditi Sangal, Melissa Mahtani and Meg Wagner, CNN

Updated 5:55 p.m. ET, October 26, 2021
11:22 a.m. ET, October 26, 2021

Here are the big takeaways from the Facebook Papers

From CNN's Tara Subramaniam

While Facebook has repeatedly come under fire over the past few years for its role in disseminating misinformation, especially related to the 2016 election, the last two months have been especially turbulent as a whistleblower and top officials have been called to testify in front of Congress following the release of leaked internal research and documents.

These disclosures, made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen's legal counsel, have shed new light on the inner workings of the tech giant. A consortium of 17 US news organizations, including CNN, has reviewed the redacted versions of the documents received by Congress. Haugen also shared some of the documents with the Wall Street Journal, which published a multi-part investigation showing that Facebook was aware of problems with its platforms.

Facebook has pushed back on Haugen's assertions, with CEO Mark Zuckerberg even issuing a 1,300-word statement suggesting that the documents are cherry-picked to present a misleading narrative about the company.

Here are some key takeaways from the tens of thousands of pages of internal documents.

Spread of misinformation

In one SEC disclosure, Haugen alleges "Facebook misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection."

And leaked comments from some Facebook employees on January 6 suggest the company might have had some culpability in what happened by not moving more quickly to halt the growth of Stop the Steal groups.

In response to these documents, a Facebook spokesperson told CNN, "The responsibility for the violence that occurred on January 6 lies with those who attacked our Capitol and those who encouraged them."

Global lack of support

Internal Facebook documents and research shared as part of Haugen's disclosures highlight gaps in Facebook's ability to prevent hate speech and misinformation in countries such as Myanmar, Afghanistan, India, Ethiopia and much of the Middle East, where coverage of many local languages is inadequate.

Human Trafficking

Facebook has known about human traffickers using its platforms since at least 2018, but has struggled to crack down on related content, company documents reviewed by CNN show.

Inciting Violence Internationally

Internal documents indicate Facebook knew its existing strategies were insufficient to curb the spread of posts inciting violence in countries "at risk" of conflict, like Ethiopia.

This is not the first time concerns have been raised about Facebook's role in the promotion of violence and hate speech. After the United Nations criticized Facebook's role in the Myanmar crisis in 2018, the company acknowledged that it didn't do enough to prevent its platform being used to fuel bloodshed, and Zuckerberg promised to increase Facebook's moderation efforts.

Impact on Teens

According to the documents, Facebook has actively worked to expand the size of its young adult audience even as internal research suggests its platforms, particularly Instagram, can have a negative effect on their mental health and well-being.

Facebook has previously acknowledged that young adult engagement on the Facebook app was "low and regressing further," and the company has taken steps to target that audience.

However, Facebook's internal research, first reported by the Wall Street Journal, claims Facebook's platforms "make body image issues worse for 1 in 3 teen girls." Its research also found that "13.5% of teen girls on Instagram say the platform makes thoughts of 'Suicide and Self Injury' worse" and 17% say the platform, which Facebook owns, makes "Eating Issues" such as anorexia worse.

Algorithms fueling divisiveness

A late 2018 analysis of 14 publishers on the social network, entitled "Does Facebook reward outrage," found that the more negative comments incited by a Facebook post, the more likely the link in the post was to get clicked.

"The mechanics of our platform are not neutral," one staffer wrote.

Get the full analysis and list of takeaways here.

11:21 a.m. ET, October 26, 2021

Facebook has language blind spots around the globe that allow hate speech to flourish

From CNN's Rishi Iyengar

A passenger looks at his mobile phone on a local train in the state of West Bengal in Kolkata, India, on Nov. 11, 2020.

For years, Facebook CEO Mark Zuckerberg touted his mission to connect the entire world — and his company has come closer than perhaps any other to fulfilling that lofty goal, with more than 3 billion monthly users across its various platforms. But that staggering global expansion has come at a cost.

Facebook's own researchers have repeatedly warned that the company appears ill-equipped to address issues such as hate speech and misinformation in languages other than English, potentially making users in some of the most politically unstable countries more vulnerable to real-world violence, according to internal documents viewed by CNN.

The documents are part of disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen's legal counsel. A consortium of 17 US news organizations, including CNN, has reviewed the redacted versions received by Congress.

Many of the countries that Facebook refers to as "At Risk" — an internal designation indicating a country's current volatility — speak multiple languages and dialects, including India, Pakistan, Ethiopia and Iraq. But Facebook's moderation teams are often equipped to handle only some of those languages and a large amount of hate speech and misinformation still slips through, according to the documents, some of which were written as recently as this year.

While Facebook's platforms support more than 100 different languages globally, its global content moderation teams do not. A company spokesperson told CNN Business that its teams consist of "15,000 people who review content in more than 70 languages working in more than 20 locations" around the world. Even in the languages it does support, the documents show several deficiencies in detecting and mitigating harmful content on the platform.

There are also translation problems for users who may want to report issues. One research note, for example, showed that only a few "abuse categories" for reporting hate speech in Afghanistan had been translated into the local language Pashto. The document was dated Jan. 13, 2021, months before the Taliban militant group's takeover of the country.

The documents, many of which detail the company's own research, lay bare the gaps in Facebook's ability to prevent hate speech and misinformation in a number of countries outside the United States, where it's headquartered, and may only add to mounting concerns about whether the company can properly police its massive platform and prevent real-world harms.

Facebook has invested a total of $13 billion since 2016 to improve the safety of its platforms, according to the company spokesperson. (By comparison, the company's annual revenue topped $85 billion last year and its profit hit $29 billion.) The spokesperson also highlighted the company's global network of third party fact-checkers, with the majority of them based outside the United States.

Read the full story here.

5:57 p.m. ET, October 25, 2021

Zuckerberg strikes defensive tone on Facebook's earnings call 

From CNN's Clare Duffy and Richard Davis

CEO Mark Zuckerberg kicked off Facebook's quarterly earnings call on Monday by addressing the latest wave of coverage based on a trove of leaked internal documents.

“Good faith criticism helps us get better, but my view is that we are seeing a coordinated effort to selectively use leaked documents to paint a false picture of our company,” he said. “The reality is that we have an open culture that encourages discussion and research on our work so we can make progress on many complex issues that are not specific just to us.”

The earnings come amid perhaps the biggest crisis in the social media giant’s 17-year history.

Tens of thousands of pages of internal documents leaked by whistleblower Frances Haugen informed the Wall Street Journal’s “Facebook Files” series and, on Monday, a flood of additional news coverage by a consortium of 17 US news organizations, as well as hearings with US and UK lawmakers. 

The documents provide the deepest look yet at many of Facebook’s biggest problems, including its struggles to regulate hate speech and misinformation, the use of its platform by human traffickers, research on harms to young people and more.

Facebook has repeatedly pushed back on many of the reports, saying they are misleading and mischaracterize its research and actions.

Zuckerberg last commented on the situation following Haugen’s Senate subcommittee hearing earlier this month, in a statement in which he tried to discredit her. Still, on Friday, another former Facebook employee anonymously filed a complaint against the company with the SEC, making allegations similar to Haugen’s.

Despite all the bad headlines, the company posted another quarter of massive earnings.

5:35 p.m. ET, October 25, 2021

Facebook reports over $9 billion quarterly profit amid damning headlines

From CNN's Clare Duffy

On a day full of bad news for Facebook, the company reminded investors that it continues to be a money-making machine.

Facebook on Monday reported $29 billion in revenue for the three months ended in September, up 33% from the same period a year earlier. The company posted nearly $9.2 billion in profit, up 17% from the year prior.

The results were nearly in line with Wall Street analysts' projections. Facebook's stock rose more than 3% in after-hours trading Monday following the earnings report.

The results come amid perhaps the biggest crisis in the social media giant's 17-year history. Tens of thousands of pages of internal documents leaked by whistleblower Frances Haugen informed the Wall Street Journal's "Facebook Files" series, and on Monday, a flood of additional news coverage by a consortium of 17 US news organizations, as well as hearings with US and UK lawmakers. The documents provide the deepest look yet at many of Facebook's biggest problems, including its struggles to regulate hate speech and misinformation, the use of its platform by human traffickers, research on harms to young people and more. (Facebook has pushed back on many of the reports, saying they are misleading and mischaracterize its research and actions.)

Still, Facebook is no stranger to PR crises. In most cases, Facebook's business has continued to chug along at a healthy clip despite outcry from regulators and the public.

But this time could be different. Facebook's massive ad business is already in a vulnerable state because of recent changes to Apple's app tracking rules. Apple's iOS 14.5 software update, which went into effect in April, requires that users give explicit permission for apps to track their behavior and sell their personal data, such as age, location, spending habits and health information, to advertisers. Facebook has aggressively pushed back against the changes and warned investors last year that the update could hurt its business if many users opt out of tracking.

On Monday, Facebook warned that the iOS 14 changes could create "continued headwinds" in the fourth quarter of 2021.

While much of the world spent the day focused on Facebook's real-world harms, the company hinted to investors in the report that it wants them looking forward, not backward. Starting in the fourth quarter, the company plans to break out Facebook Reality Labs — its division dedicated to augmented and virtual reality services — as a separate reporting segment from its family of apps, which includes Instagram, WhatsApp and Facebook's namesake social network.

CFO David Wehner said Facebook is investing so heavily in this newer division that it will reduce "our overall operating profit in 2021 by approximately $10 billion."

In a statement with the results, Facebook CEO and cofounder Mark Zuckerberg also focused on what's next: "I'm excited about our roadmap, especially around creators, commerce, and helping to build the metaverse."

5:28 p.m. ET, October 25, 2021

Anti-Defamation League blasts Facebook: Never has a single company been responsible for so much misfortune

From CNN's Matt Egan 

Jonathan Greenblatt, the CEO of the Anti-Defamation League, blasted Facebook on Monday following the publication of a series of articles revealing the company’s struggles to stop hate speech, human trafficking and coordinated groups that sowed discord ahead of the Jan. 6 insurrection.

“The kind of monopolistic indifference the company has demonstrated in dealing with hate is mind-bending,” Greenblatt told CNN in a phone interview. “I don’t think ever before a single company has been responsible for so much misfortune.”

Greenblatt said the ADL is in talks with members of its coalition to “explore the appropriate response” to the Facebook Papers. “There are things advertisers can do to demonstrate their discontent,” he said. 

Last year the ADL helped launch Stop Hate for Profit, a campaign that called on major companies to pause advertising on Facebook for failures to address the incitement of violence on the platform. Hundreds of companies eventually joined the ad boycott.

“Advertisers, from Fortune 500 companies to small businesses, need to ask themselves: Do they want to continue to invest in a platform that is knowingly pushing out misinformation and hate and that seems designed more to divide than convene?” Greenblatt said. “Companies can vote with their wallets and decide where they want to build their brands, redirecting resources away from Facebook.”

The comments come after a consortium of 17 US news organizations began publishing the Facebook Papers, a series of stories based on a trove of hundreds of internal documents that were included in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen’s legal counsel. The consortium, which includes CNN, reviewed the redacted versions received by Congress.

CNN's coverage includes stories about how coordinated groups on Facebook sow discord and violence, including on Jan. 6, as well as Facebook's challenges moderating content in some non-English-speaking countries, and how human traffickers have used its platforms to exploit people.

“The news is stunning, but not shocking,” Greenblatt said. “Mark Zuckerberg would have you believe [Facebook] was doing all it could. Now we know the truth: He was aware and did nothing about it.”

As CNN reported on Friday, the Facebook Papers suggest the company was fundamentally unprepared for how the Stop the Steal movement used its platform to organize ahead of the Jan. 6 insurrection.

“They misled investors and the public about the spread of misinformation that led to the January 6 insurrection,” Greenblatt said.

Greenblatt had his own spin on Facebook CEO Mark Zuckerberg’s famous “move fast and break things” motto for his company: “Move fast and lie about things.”

“We know they continually misled the public, misled the press, misled organizations like mine about the steps they were taking to deal with the hate on their service,” Greenblatt said.

Facebook did not respond to a request for comment but the company has denied the premise of Haugen’s conclusions around the company’s role in the Jan. 6 insurrection.

"The responsibility for the violence that occurred on January 6 lies with those who attacked our Capitol and those who encouraged them. We took steps to limit content that sought to delegitimize the election, including labeling candidates' posts with the latest vote count after Mr. Trump prematurely declared victory, pausing new political advertising and removing the original #StopTheSteal Group in November," Facebook spokesperson Andy Stone told CNN Friday.

Facebook also published a blog post detailing its efforts around the 2020 election. 

Still, Greenblatt said the current moment is an opportunity for business leaders to speak up about the problems at Facebook.

“Whether you’re a corporation or a celebrity or an elected official, all of us have a stake in getting this right,” he said. “Unfortunately, Facebook’s problem is all of our problem.”

However, Greenblatt suggested the best path, at this point, is to pursue regulatory changes through Congress and government agencies.

“I believe in self-regulation,” Greenblatt said, pointing to his own career in Silicon Valley. “But Facebook has proven itself incapable of demonstrating the kind of responsibility we expect for a company of any size, let alone one of its sheer scale.”

Watch CNN reporting:

4:45 p.m. ET, October 25, 2021

Former Facebook employee Sophie Zhang said she felt like there was "blood on her hands" after working there

From CNN's Donie O'Sullivan

This is not the first time Facebook's current and former employees have complained about the company's troubling practices and culture.

Sophie Zhang, who worked as a data scientist at the tech giant for almost three years, said she felt like she had "blood on her hands" after working there.

Zhang wrote a lengthy memo when she was fired by Facebook last year, detailing how she believed the company was not doing enough to tackle hate and misinformation — particularly in smaller and developing countries. Zhang said the company told her she was fired because of performance issues.

The memo was first reported last year by BuzzFeed News and later helped form the basis of a series of reports by The Guardian newspaper.

She is willing to testify before Congress about her former employer, she told CNN following whistleblower Frances Haugen's testimony. She said she had also passed on documentation about the company to a US law enforcement agency.

"I provided detailed documentation regarding potential criminal violations to a U.S. law enforcement agency. My understanding is that the investigation is still ongoing," she tweeted.

Central to Zhang's allegations about Facebook is that it doesn't do enough to tackle abuse of its platform in countries outside of the United States. Roughly 90% of Facebook's monthly active users are outside the US and Canada, according to its most recent quarterly filing.

Read the full story here.

Watch more:

1:39 p.m. ET, October 25, 2021

Facebook knew it was being used to incite violence in Ethiopia. Little was done to stop it.

From CNN's Eliza Mackintosh

Mekelle, the regional capital of Tigray, in northern Ethiopia, is seen through a bullet hole at the Ayder Referral Hospital, in May 2021.

Facebook employees repeatedly sounded the alarm on the company's failure to curb the spread of posts inciting violence in "at risk" countries like Ethiopia, where a civil war has raged for the past year, internal documents seen by CNN show.

The social media giant ranks Ethiopia in its highest priority tier for countries at risk of conflict, but the documents reveal that Facebook's moderation efforts were no match for the flood of inflammatory content on its platform.

The documents are among dozens of disclosures made to the US Securities and Exchange Commission (SEC) and provided to Congress in redacted form by Facebook whistleblower Frances Haugen's legal counsel. A consortium of 17 US news organizations, including CNN, has reviewed the redacted versions received by Congress.

They show employees warning managers about how Facebook was being used by "problematic actors," including states and foreign organizations, to spread hate speech and content inciting violence in Ethiopia and other developing countries, where its user base is large and growing. Facebook estimates it has 1.84 billion daily active users — 72% of whom are outside North America and Europe, according to its annual SEC filing for 2020.

For example, an internal report distributed in March, entitled "Coordinated Social Harm," said that armed groups in Ethiopia were using the platform to incite violence against ethnic minorities in the "context of civil war."

The documents also indicate that the company has, in many cases, failed to adequately scale up staff or add local language resources to protect people in these places.

Read the full story about what's going on in Ethiopia, what Facebook knew about violent actors and the company's insufficient mitigation strategies.

1:08 p.m. ET, October 25, 2021

Facebook employees flagged people for sale on its platforms in 2018. It's still a problem.

From CNN Business' Clare Duffy

Facebook has for years struggled to crack down on content related to what it calls domestic servitude: "a form of trafficking of people for the purpose of working inside private homes through the use of force, fraud, coercion or deception," according to internal Facebook documents reviewed by CNN. 

The company has known about human traffickers using its platforms in this way since at least 2018, the documents show. The problem got so bad that in 2019, Apple threatened to pull Facebook and Instagram's access to the App Store, a platform the social media giant relies on to reach hundreds of millions of users each year. Internally, Facebook employees rushed to take down problematic content and make emergency policy changes to avoid what they described as a "potentially severe" consequence for the business.

But while Facebook managed to assuage Apple's concerns at the time and avoid removal from the App Store, issues persist. The stakes are significant: Facebook documents describe women trafficked in this way being subjected to physical and sexual abuse, being deprived of food and pay, and having their travel documents confiscated so they can't escape. Earlier this year, an internal Facebook report noted that "gaps still exist in our detection of on-platform entities engaged in domestic servitude" and detailed how the company's platforms are used to recruit, buy and sell what Facebook's documents call "domestic servants."

Last week, using search terms listed in Facebook's internal research on the subject, CNN located active Instagram accounts purporting to offer domestic workers for sale, similar to accounts that Facebook researchers had flagged and removed. Facebook removed the accounts and posts after CNN asked about them, and spokesperson Andy Stone confirmed that they violated its policies.

"We prohibit human exploitation in no uncertain terms," Stone said. "We've been combatting human trafficking on our platform for many years and our goal remains to prevent anyone who seeks to exploit others from having a home on our platform."

Read more here.

1:37 p.m. ET, October 25, 2021

Here's a recap of the Facebook whistleblower's testimony in the UK parliament

Facebook whistleblower Frances Haugen leaves after giving evidence to the joint committee for the Draft Online Safety Bill, as part of British government plans for social media regulation, at the Houses of Parliament in London, Monday, Oct. 25, 2021.

Facebook whistleblower Frances Haugen's testimony in the UK parliament has concluded. She raised concerns about the company's focus on algorithms and its approach to misinformation and hate speech moderation. She also reiterated her call for regulating the tech giant to get more transparency, which she previously spoke about in her testimony to US Congress.

Here's a recap of what Haugen told the UK parliament:

Facebook under-invests in content safety systems for non-English languages.

"Facebook says things like, 'we support 50 languages,' when in reality, most of those languages get a tiny fraction of the safety systems that English gets," Haugen told British lawmakers. "UK English is sufficiently different that I would be unsurprised if the safety systems that they developed primarily for American English were actually [under-enforced] in the UK."

Facebook should not be allowed to "mislead" its Oversight Board.

"I hope the Oversight Board takes this moment to stand up and demand a relationship that has more transparency," said Haugen. "If Facebook can come in there and just actively mislead the Oversight Board — which is what they did — I don’t know what the purpose of the oversight board is."

The board adjudicates cases on controversial content that has been either left up or taken down — but these cases are just "the tip of the iceberg" when it comes to oversight at Facebook, Oversight Board member and PEN America CEO Suzanne Nossel said.

The UK is leading the world in its efforts to regulate social media platforms through its Draft Online Safety Bill.

Haugen said she couldn't imagine that Facebook CEO and founder Mark Zuckerberg "isn't paying attention" to the efforts.

While countries in the "Global South" do "not have the resources to stand up and save their own lives," the UK has the chance to take a "world leading stance" with its bill, which seeks to impose a duty of care on social media sites towards their users, Haugen added.

Facebook views safety as a cost center instead of a growth center.

"I think there is a view inside the company that safety is a cost center; it's not a growth center, which, I think, is very short-term in thinking. Because Facebook's own research has shown that when people have worse integrity experiences on the site, they are less likely to retain," she said Monday.

She urged British lawmakers to put regulations in place, saying it was for the good of the company's long-term growth.

"I think regulation could actually be good for Facebook's long-term success. Because it would force Facebook back into a place where it was more pleasant to be on Facebook," she said.