Facebook whistleblower testifies in Congress

By Clare Duffy, Melissa Macaya, Mike Hayes, Samantha Murphy Kelly, Veronica Rocha, Adrienne Vogt and Aditi Sangal, CNN

Updated 6:13 p.m. ET, October 5, 2021
12:57 p.m. ET, October 5, 2021

Whistleblower: "I have strong national security concerns about how Facebook operates today"

From CNN Business' Clare Duffy

Haugen, whose last role at Facebook was as a product manager supporting the company’s counter-espionage team, was asked whether Facebook is used by “authoritarian or terrorist-based leaders” around the world. 

She said such use of the platforms is “definitely” happening, and that Facebook is “very aware” of it.  

“My team directly worked on tracking Chinese participation on the platform, surveilling, say, Uyghur populations, in places around the world. You could actually find the Chinese based on them doing these kinds of things,” Haugen said. “We also saw active participation of, say, the Iran government doing espionage on other state actors.” 

She went on to say that she believes, “Facebook’s consistent understaffing of the counterespionage information operations and counter terrorism teams is a national security issue, and I’m speaking to other parts of Congress about that … I have strong national security concerns about how Facebook operates today.”  

Sen. Richard Blumenthal suggested that these national security concerns could be the subject of a future subcommittee hearing.

1:00 p.m. ET, October 5, 2021

Whistleblower: Transparency and dissolving an engagement-based ranking system would improve Facebook

From CNN's Aditi Sangal

Facebook whistleblower Frances Haugen detailed how the social media platform could become a better environment, saying it could introduce transparency measures and small frictions, and move away from the "dangerous" engagement-based ranking system.

This would recenter the methods of amplification without focusing on "picking winners or losers in the marketplace of ideas," she told senators on Tuesday.

"On Twitter, you have to click through on a link before you reshare it. Small actions like that friction don’t require picking good ideas and bad ideas, they just make the platform less twitchy, less reactive. Facebook’s internal research says each one of these small actions, dramatically reduces misinformation, hate speech and violence-inciting content on the platforms," Haugen said.

She advocated for chronologically ordered content instead.

"I'm a strong proponent of chronological ranking, ordering by time, with a little bit of spam demotion. Because I think we don't want computers deciding what we focus on, we should have software that is human-scaled, or humans have conversations together, not computers facilitating who we get to hear from," Haugen said.

In addition, she encouraged a privacy-conscious regulatory body working with academics, researchers and government agencies to "synthesize requests for data" because currently, the social media giant is not obligated to disclose any data.

"Even data as simple as what integrity systems exist today and how well do they perform?" she suggested. "Basic actions like transparency would make a huge difference."

What does engagement-based ranking system mean?

Facebook, like other platforms, uses algorithms to amplify or boost content that receives engagement in the form of likes or shares or comments. In Facebook’s view, this helps a user “enjoy” their feed, Haugen explained.

“The dangers of engagement-based ranking are that Facebook knows that content that elicits an extreme reaction from you is more likely to get a click, a comment or reshare," which aren't necessarily for the user's benefit, she added. "It's because they know that other people will produce more content if they get the likes and comments and reshares. They prioritize content in your feed so that you will give little hits of dopamine to your friends so they'll create more content.”

1:00 p.m. ET, October 5, 2021

Whistleblower: Facebook should declare "moral bankruptcy" and ask Congress for help

Former Facebook employee Frances Haugen said it's time for Facebook to declare "moral bankruptcy" and admit that they have a problem and they need help solving it.

She said that since Facebook is a "closed system" the company has "had the opportunity to hide their problems."

"And like people often do when they can hide their problems, they get in over their heads," Haugen added.

She said that Congress should step in and say to the company, "You don't have to hide these things from us" and "pretend they're not problems."

She believes that Congress should give Facebook the opportunity to "declare moral bankruptcy and we can figure out how to fix these things together."

Asked to clarify what she meant by "moral bankruptcy," Haugen said she envisioned a process like financial bankruptcy where "they admit they did something wrong" and there is a "mechanism" to "forgive them" and "move forward."

"Facebook is stuck in a feedback loop that they cannot get out of...they need to admit that they did something wrong and that they need help to solve these problems. And that's what moral bankruptcy is," she said.

12:11 p.m. ET, October 5, 2021

The Senate hearing is back after a short break

From CNN's Samantha Murphy Kelly and Clare Duffy

The Senate hearing with Facebook whistleblower Frances Haugen has resumed.

The former Facebook product manager who worked on civic integrity issues at the company has been facing questions from a Commerce subcommittee about what Facebook-owned Instagram knew about its effects on young users, among other issues.

"I am here today because I believe that Facebook's products harm children, stoke division, and weaken our democracy," she said during her opening remarks. "The company's leadership knows how to make Facebook and Instagram safer but won't make the necessary changes because they have put their astronomical profits before people. Congressional action is needed. They won't solve this crisis without your help."

She emphasized that she came forward "at great personal risk" because she believes "we still have time to act. But we must act now."


11:45 a.m. ET, October 5, 2021

Whistleblower: Facebook's artificial intelligence systems only catch "very tiny minority" of offending content

From CNN's Aditi Sangal

Facebook's artificial intelligence (AI) systems "only catch a very tiny minority of offending content," whistleblower Frances Haugen told congressional lawmakers on Tuesday.

"The reality is that we've seen from repeated documents within my disclosures, is that Facebook's AI systems only catch a very tiny minority of offending content. And best case scenario, and the case of something like hate speech, at most they will ever get 10 to 20%. In the case of children, that means drug paraphernalia ads like that, it's likely if they rely on computers and not humans, they will also likely never get more than 10 to 20% of those ads," Haugen explained.

She said the reason behind it is Facebook's "deep focus on scale."

"So scale is, 'can we do things very cheaply for a huge number of people?' Which is part of why they rely on AI so much. It's possible none of those ads were seen by a human," she said.

11:57 a.m. ET, October 5, 2021

Facebook struggles to tackle problems because it's "understaffed," whistleblower says

From CNN's Brian Stelter

Facebook is extraordinarily profitable, so it is intriguing to hear Frances Haugen repeatedly refer to the company as "understaffed." She said this staffing shortage contributes to a vicious cycle of platform-wide problems.

"Facebook has struggled for a long time to recruit and retain the number of employees it needs to tackle the large scope of projects that it has chosen to take on," Haugen said, emphasizing the word "chosen."

"Facebook is stuck in a cycle where it struggles to hire; that causes it to understaff projects; which causes scandals; which then makes it harder to hire," she said.

In a later exchange, Haugen described the following "pattern of behavior": Often, she said, "problems were so understaffed that there was kind of an implicit discouragement from having better detection systems." For example, "my last team at Facebook was on the counterespionage team within the threat intelligence org, and at any given time, our team could only handle a third of the cases that we knew about. We knew that if we built even a basic detector, we would likely have many more cases."

It's a twist on the adage about being "too big to fail." Longtime tech reporter Craig Silverman observed that Haugen was calling Facebook "too big to staff."

11:36 a.m. ET, October 5, 2021

Whistleblower: Bullying on Instagram follows kids home

From CNN's Aditi Sangal

Facebook whistleblower Frances Haugen told Senate lawmakers that Instagram has changed home lives for children.

"The kids who are bullied on Instagram, the bullying follows them home. It follows them into their bedrooms. The last thing they see before they go to bed at night is someone being cruel to them. Or the first thing in the morning is someone being cruel to them. Kids are learning that their own friends, people who they care about, are cruel to them," she said.

This could potentially impact their domestic relationships as they grow older, Haugen added.

"Facebook's own research is aware that children express feelings of loneliness and struggling with these things because they can't even get support from their own parents" who have never had this experience with technology, the former Facebook employee added. "I don't understand how Facebook can know all these things and not escalate it to someone like Congress for help and support in navigating these problems."


11:31 a.m. ET, October 5, 2021

Facebook spokesperson weighs in on Haugen's testimony

From CNN Business' Clare Duffy

As lawmakers questioned whistleblower Frances Haugen about how Facebook attracts and treats young users, Facebook spokesperson Andy Stone pushed back in real time on Twitter. He tweeted that Haugen did not directly work on child safety issues at the company.

Haugen has been transparent about the fact that she did not work on child safety issues at Facebook; she noted in one answer that although she has some knowledge of the issue, she did not work directly on it. However, Haugen provided extensive internal documentation related to Facebook's research on the topic to lawmakers.

11:27 a.m. ET, October 5, 2021

Facebook whistleblower says 2020 election misinformation caused her to speak out

From CNN's Adrienne Vogt

Facebook whistleblower Frances Haugen said the impetus for her to speak out came after the 2020 presidential election.

"There was a long series of moments where I became aware that Facebook — when faced with conflicts of interest between its own profits and the common good, public safety — that Facebook consistently chose to prioritize its profits. I think the moment which I realized we needed to get help from the outside, that the only way these problems would be solved would be by solving them together, was when civic integrity was dissolved following the 2020 election," Haugen told lawmakers at a Senate hearing.

"It really felt like a betrayal of the promises that Facebook had made to people who had sacrificed a great deal to keep the election safe, by basically dissolving our community and integrating in just other parts of the company," she said.

Some more background: Haugen, 37, joined Facebook in 2019 to work on civic integrity, including "issues related to democracy and misinformation," according to her website. Those issues have been front and center for critics of Facebook and other social media companies, particularly around the coronavirus pandemic and the 2020 US presidential election.

Haugen took the job at Facebook to work on addressing misinformation, she said in a "60 Minutes" interview Sunday. Similarly to today's hearing, she said her feelings about the company started to change when it decided to dissolve its civic integrity team shortly after the election.

CNN's Rishi Iyengar contributed reporting to this post.