Over the past few years, Facebook has repeatedly come under fire, criticized for everything from how the company’s products psychologically harm their users to its role in disseminating fake news, especially related to the 2016 election.
Earlier this month, a Wall Street Journal investigation thrust Facebook back into the spotlight with claims that the tech giant’s top executives have not fixed its problems despite internal research detailing the platform’s negative effects, from spreading misinformation and inciting anger to damaging the mental health of teens.
Responding to the latest reports, Nick Clegg, Facebook’s vice president of global affairs, said in a statement, “At the heart of this series is an allegation that is just plain false: that Facebook conducts research and then systematically and willfully ignores it if the findings are inconvenient for the company.”
While CNN can’t confirm Facebook’s motives or intentions, one thing is clear: on more than one occasion, Facebook officials have said one thing in public while internal documents told a different story.
Impact on children and teens’ mental health
In a March 2021 congressional hearing, Rep. Cathy McMorris Rodgers, a Washington state Republican, asked Facebook CEO Mark Zuckerberg if he agreed that “too much time in front of screens passively consuming content” such as social media was harmful to the mental health of children.
In response, Zuckerberg said, “I don’t think that the research is conclusive on that.”
“The research that we’ve seen is that using social apps to connect with other people can have positive mental-health benefits,” Zuckerberg added. “Passively consuming content doesn’t have those positive benefits to well-being but isn’t necessarily negative.”
Two months later, Adam Mosseri, the head of Instagram, told reporters that the research he had seen suggested the app’s negative impact on teens was “quite small.”
However, research, including studies conducted by Facebook itself, contradicts these claims. Internal documents obtained by the Wall Street Journal, some posted to a company message board more than a year before Zuckerberg’s testimony, indicate that Instagram, which is owned by Facebook, has negatively affected many of its millions of users, especially young women.
In a September 2021 episode of “The Journal” podcast, executive editor and co-host Kate Linebaugh reported that “One internal document says that for teen girls who’d recently experienced body image issues, Instagram made those feelings worse for one in three of them.”
“Over and over again they (researchers) report that the teens say that constant comparison on Instagram is contributing to higher levels of anxiety and depression,” WSJ reporter Georgia Wells said on the podcast.
Facebook’s founding president Sean Parker even admitted at an Axios event back in 2017 that “God only knows” what social media is doing to children’s brains.
Parker said he and Zuckerberg knew early on that the “social validation feedback loop” inherent in the design and algorithms of social platforms like Facebook was “exploiting a vulnerability in human psychology,” but they “did it anyway.”
Earlier this month, Mosseri acknowledged in an interview that some of the issues highlighted in the WSJ’s investigative series “aren’t necessarily widespread, but their impact on people may be huge.” Mosseri added that he was “very proud” of the company’s internal research, noting that while some features of Instagram can harm young users, “There’s a lot of good that comes with what we do.”
In a September 14 statement, Instagram’s Head of Public Policy Karina Newton said that they “stand by” the internal research but argued that the Wall Street Journal story “focuses on a limited set of findings and casts them in a negative light.”
Russian interference and the 2016 election
Speaking at a conference shortly after the presidential election, Zuckerberg called the idea that fake news on Facebook influenced the election in any way “pretty crazy.” Testifying before Congress in April 2018, Zuckerberg said the company learned about new Russian “information operations” on the platform “right around the time of the 2016 election itself.”
But according to an excerpt from “An Ugly Truth: Inside Facebook’s Battle for Domination,” a book by New York Times reporters Sheera Frenkel and Cecilia Kang, the company’s security team first detected Russian activity on the platform in March, eight months before the election. The authors write that Facebook’s chief security officer at the time, Alex Stamos, “felt he had been trying to sound the alarm on Russia for months” and had briefed the people in his “reporting chain.” The book excerpt says Zuckerberg and COO Sheryl Sandberg weren’t notified about the extent of Russia’s meddling until Stamos briefed them in December.
As Facebook investigated the extent of Russian influence, it continued to make public statements at odds with internal documents from that time.
In July 2017, a Facebook spokesperson told CNN, “we have seen no evidence that Russian actors bought ads on Facebook in connection with the election.” But in September, Facebook said an internal review covering June 2015 through May 2017 had uncovered some 3,000 ads “connected to about 470 inauthentic accounts and pages in violation of our policies.” The company said the accounts and pages “likely operated out of Russia,” and although the “vast majority” didn’t specifically reference the election, some did. Overall, the ads “appeared to focus on amplifying divisive social and political messages across the ideological spectrum.”
Speaking exclusively to CNN’s Laurie Segall in March 2018, shortly after news broke about Cambridge Analytica scraping user data, Zuckerberg said, “I think what’s clear is that in 2016, we were not as on top of a number of issues as we should have [been], whether it was Russian interference or fake news.”
Pivot to video
When Facebook’s public statements diverge widely from its own internal documents and research, the consequences ripple outward, from shaping an election to upending business models.
In April 2016, shortly before the company announced new features for its Facebook Live videos, Zuckerberg told BuzzFeed News, “We’re entering this new golden age of video.” Two months later, Nicola Mendelsohn, Facebook’s top executive in Europe, told attendees at a Fortune Magazine conference in London that content on the platform would likely be “all video” within five years.
In September of that year, Facebook disclosed that it had miscalculated, and in some cases overstated, several metrics relied on by advertisers, including the average time users spent watching videos. The error likely proved costly both to advertisers and to companies that had heeded the call to pivot to video, including many media organizations that laid off writers.
Facebook said it discovered the error about a month before announcing it publicly.
A class action lawsuit initially filed in 2016 cited internal documents unsealed in 2018 as proof that Facebook engineers knew that the metrics related to video watch time had been over-reported more than a year before they acknowledged it publicly.
According to the amended complaint, Facebook told advertisers they had “‘recently discovered a discrepancy’” while “personnel internally emphasized that ‘we didn’t recently discover a discrepancy.’”
The lawsuit, which initially claimed that Facebook’s inaccurate metrics represented unfair business conduct and later added the fraud claim, was ultimately settled. Under the terms of the settlement proposed in October 2019, Facebook agreed to pay $40 million and acknowledge there was a metric calculation error but not admit wrongdoing regarding any of the other allegations.
Facebook has also had to contend with a separate lawsuit over allegations that it knew about problems with another metric called potential reach before it was disclosed to advertisers.
A lawsuit filed in October 2018 alleged that Facebook knew about the errors before they were first publicized, citing internal emails in which Sandberg acknowledged that she had been aware of the problems with the potential reach metric for several years.
These internal documents cited in the lawsuit and unsealed in February 2021 suggest Facebook may have known about the error at least a year before the push for Facebook Live and even prior to Zuckerberg’s comments about the “golden age of video.”
Facebook spokesperson Joe Osborne told CNN Business at the time, “These documents are being cherry-picked to fit the plaintiff’s narrative.”
Osborne called potential reach “a helpful campaign planning tool that advertisers are never billed on,” adding “It’s an estimate and we make clear how it’s calculated in our ads interface and Help Center.”