Following congressional testimony from whistleblower Frances Haugen, Facebook CEO Mark Zuckerberg pushed back in a Facebook post Tuesday night against Haugen’s allegations that the tech giant is hiding research about its shortcomings from investors and the public.
In a 1,300-word statement, Zuckerberg defended Facebook’s services and, appealing to its employees, suggested that both their work and the company as a whole were being misrepresented.
Here’s a look at what Zuckerberg claimed and what we know.
Impact on teens
In response to claims from Haugen and an investigation by the Wall Street Journal that Instagram, which is owned by Facebook, has negatively impacted many of its millions of users, especially young women, Zuckerberg cited a previous Facebook statement on the issue.
“The research actually demonstrated that many teens we heard from feel that using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced,” the statement said. “In fact, in 11 of 12 areas on the slide referenced by the Journal – including serious areas like loneliness, anxiety, sadness and eating issues – more teenage girls who said they struggled with that issue also said Instagram made those difficult times better rather than worse.”
Facts First: While it’s true that many teens surveyed may have reported having a positive experience with Instagram, that should not invalidate or outweigh the fact that the platform does have a negative impact on a sizable portion of its users. Both documents released by Haugen and other studies indicate that Instagram in particular can be harmful to teens, young girls especially, and in some cases can exacerbate the very issues Zuckerberg says it helps make better.
According to Facebook’s own internal research cited in one of Haugen’s filings to the Securities and Exchange Commission, “13.5% of teen girls on Instagram say the platform makes thoughts of ‘Suicide and Self Injury’ worse” and 17% say the platform makes “Eating Issues” such as anorexia worse. Its research also claims Facebook’s platforms “make body image issues worse for 1 in 3 teen girls.”
This research is supported by experiments conducted by Sen. Richard Blumenthal’s office and replicated by CNN.
CNN’s Donie O’Sullivan, who covers Facebook, told CNN’s John Berman on “New Day” that within a week of setting up an Instagram account as a 13-year-old girl that followed a few accounts about dieting and related topics, “Instagram’s algorithm, the algorithm that Zuckerberg controls, is pounding that account now with suggestions, more and more and more, pro-eating disorder, pro-anorexia accounts.”
Addressing the impact of Facebook’s platforms on teens during her testimony, Haugen said, “In the case of cigarettes, only about 10% of people who smoke ever get lung cancer, so the idea that 20% of your users could be facing serious mental health issues and that’s not a problem is shocking.”
Hiding research
“If we wanted to hide our results, why would we have established an industry-leading standard for transparency and reporting on what we’re doing?” Zuckerberg’s statement claimed.
Facts First: This is misleading at best. There’s evidence beyond Haugen’s testimony that Facebook has hidden reports or results in the past.
For example, after Facebook released a quarterly report in August indicating the most popular posts on its platform were positive and innocuous, like recipes, The New York Times first reported that the company had shelved a report covering the first three months of 2021 because of concerns it would look bad for the company. In that shelved version, the Times reported, the top link on the platform was a piece of anti-vaccine misinformation.
Facebook ultimately came clean, with company spokesperson Andy Stone saying, “We’re guilty of cleaning up our house a bit before we invited company.”
Efforts to address harmful content
Zuckerberg wrote, “If we didn’t care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space – even ones larger than us?”
Facts First: This needs context. Facebook has known about the need to fight harmful content on its platform for years but has not committed to hiring and training full-time Facebook employees to address it.
Facebook does have content moderators, but many of them are contractors or subcontractors, not officially employed by the company. As a result, they’re paid less and don’t receive several of the perks that normally come with working for Facebook.
“If they really cared about this issue, if they really wanted to bring it up to that standard, why not spend that additional money? Bring more people in house?” O’Sullivan said. “Facebook will probably say this is a scale issue, but this issue has been on their platform for a decade now; what’s delaying them?”