Instagram head testifies before Congress

By Aditi Sangal, Clare Duffy, Brian Fung and Samantha Kelly, CNN

Updated 2353 GMT (0753 HKT) December 8, 2021
3:40 p.m. ET, December 8, 2021

Instagram head admits there's a loophole in teen account privacy default

From CNN's Clare Duffy

(Olivier Douliery/AFP/Getty Images)

Instagram head Adam Mosseri admitted to lawmakers that there was a loophole in a default privacy setting meant to protect teens on the platform.

For users under the age of 16, the company has said that newly created accounts are private — meaning other users must request to follow them in order to view their content — by default. But GOP Sen. Marsha Blackburn of Tennessee, ranking member of the subcommittee, said her office created an account for a hypothetical 15-year-old girl on the web browser version of Instagram ahead of the hearing, and the account was set to public by default.

"Isn't the opposite supposed to happen?" Blackburn said "And have you considered turning off the public option altogether for minor accounts?"

Mosseri said he learned of the issue with the privacy default Wednesday morning, hours before the hearing.

"It turns out that we default those under the age of 16 to private accounts for the vast majority of accounts, which are created on Android and iOS, but we have missed that on the web and we will correct that quickly," Mosseri said, skirting the second half of Blackburn's question.
3:22 p.m. ET, December 8, 2021

Read some excerpts from Adam Mosseri's testimony on Instagram's impact on kids

Head of Instagram Adam Mosseri testifying at a US Senate hearing in Washington, DC, on December 8, 2021. (Brendan Smialowski/AFP/Getty Images)

Adam Mosseri's prepared testimony covers a range of steps Instagram has taken to help keep users safe, from making it harder for young people to receive unwanted messages from adults to restricting advertising so that marketers can only target minors based on age, gender and location.

Here are a few key excerpts from the head of Instagram's opening remarks, where he stated he firmly believes that Instagram "can be a positive force in young people's lives."

On using age verification as a way of keeping young users safe:

Mosseri stated that users under the age of 13 are not permitted on Instagram, and that the platform is trying to build new technologies to find and remove accounts belonging to those under 13.

Additionally, Instagram will launch its first set of controls for parents and guardians in March, he said.

Calling for more industry regulation:

Mosseri also stated his support for updated regulations to keep people safe online.

"Specifically, we believe there should be an industry body that will determine best practices when it comes to at least three questions: how to verify age, how to build age-appropriate experiences, how to build parental controls. The body should receive input from civil society, from parents, and from regulators. The standards need to be high and the protections universal. And I believe that companies like ours should have to earn some of the Section 230 protections by adhering to those standards," he said in his prepared testimony.
3:09 p.m. ET, December 8, 2021

Here are a few excerpts from GOP Sen. Marsha Blackburn's opening remarks

(U.S. Senate Committee on Commerce, Science, and Transportation)

Republican Sen. Marsha Blackburn, who is the ranking member of the Senate subcommittee, expressed her frustration with Big Tech at the hearing. Here are a few excerpts from her opening remarks:

"I want to be honest and tell you that I am a bit frustrated today … Tennesseans want Big Tech to be more transparent and to accept responsibility for your actions. And time and time again, you say things that make it sound like you are hearing us and agree – but then nothing changes," she said to Adam Mosseri, head of Instagram.

Blackburn also criticized the latest set of product updates Instagram released, calling them "half measures."

"Yesterday, at 3:00 a.m. – which is midnight in Silicon Valley – you released a list of product updates you said would 'raise the standard for protecting teens and supporting parents online.' I’m not sure what hours you all keep in California. But where I’m from, the middle of the night is when you drop news that you don’t want people to see," she said Wednesday. "While I’m sure you know that we fully share the goal of protecting kids and teens online, what we aren’t sure about is how the half measures you’ve introduced are going to get us to the point where we need to be."

The measures are also "a case of too little, too late," she added.

"Because now there is bipartisan momentum – both here and in the House – to tackle these problems we are seeing with Big Tech ... This is the appropriate time to pass a national consumer privacy bill as well as kids-specific legislation to keep minors safe online. We also need to give serious thought to how companies like Facebook and Instagram continue to hide behind Section 230’s liability shield when it comes to content like human trafficking, sex trafficking, drug trafficking – despite Congress speaking clearly to this issue when it passed FOSTA-SESTA a few years ago."
3:09 p.m. ET, December 8, 2021

Sen. Blumenthal: "The trust is gone" from big tech

(U.S. Senate Committee on Commerce, Science, and Transportation)

Clad in a dark suit and appearing in person before Senate lawmakers, Mosseri spent a few moments arranging his notes before Sen. Richard Blumenthal, who chairs the Senate’s consumer protection subcommittee, kicked things off.

Citing a crisis in teen mental health, Blumenthal said companies like Instagram — along with the algorithms that power it — have built “addictive” products “that can exploit children’s insecurities and anxieties.”

Though Mosseri’s prepared testimony calls for the creation of an industry self-regulatory body to come up with best practices for teen social media, Blumenthal signaled he views that as a nonstarter.

“Some of the big tech companies have said ‘Trust us.’ That seems to be what Instagram is saying in your testimony,” Blumenthal said. “But self-policing depends on trust. The trust is gone.”

On Monday, Blumenthal said, his office created a fake Instagram account for a teen and began following accounts promoting eating-disorder content — a follow-up experiment to a similar test he ran two months ago. “Within an hour all of our recommendations promoted pro-anorexia and eating disorder content,” Blumenthal said. “Nothing has changed. It’s all still happening.”

2:54 p.m. ET, December 8, 2021

State attorneys general have launched an investigation into Instagram's impact on kids

From CNN's Clare Duffy

A person walks past a newly unveiled logo for "Meta", the new name for Facebook's parent company, outside Facebook headquarters in Menlo Park on October 28, 2021. (Noah Berger/AFP/Getty Images)

As the head of Instagram testifies on Capitol Hill, a bipartisan investigation into the company is also underway.

A group of 10 state attorneys general have launched an investigation into Meta, the social media company formerly known as Facebook, focused on the potential harms of its Instagram platform on children and teens.

The announcement made last month follows extensive reporting on a trove of internal documents leaked by whistleblower Frances Haugen. Some of those documents show that the company's own researchers have found that Instagram can damage young users' mental health and body image, and can exacerbate dangerous behaviors such as eating disorders.

The attorneys general say they will look into whether, by continuing to provide and promote Instagram despite knowing of the potential harms, Meta violated consumer protection laws and "put the public at risk." The states involved include California, Florida, Kentucky and Vermont.

"Facebook, now Meta, has failed to protect young people on its platforms and instead chose to ignore or, in some cases, double down on known manipulations that pose a real threat to physical and mental health — exploiting children in the interest of profit," Massachusetts Attorney General Maura Healey, who is co-leading the investigation, said in a statement. She added that the coalition hopes to "get to the bottom of this company's engagement with young users, identify any unlawful practices, and end these abuses for good."

Meta spokesperson Andy Stone said in a statement that the allegations made by the attorneys general are false and said they "demonstrate a deep misunderstanding of the facts."

"While challenges in protecting young people online impact the entire industry, we've led the industry in combating bullying and supporting people struggling with suicidal thoughts, self-injury, and eating disorders," the statement reads. "We continue to develop parental supervision controls and are exploring ways to provide even more age-appropriate experiences for teens by default."

Read the full story here.

2:20 p.m. ET, December 8, 2021

How Instagram's head plans to defend the platform

From CNN's Brian Fung

Adam Mosseri, Head of Instagram, speaking at an event at the Metropolitan Museum of Art in New York in September. (Roy Rochlin/Getty Images)

As he confronts allegations that Instagram knew it was harmful to young people, Adam Mosseri, the head of Instagram, is expected to counter that the platform has long worked to safeguard the wellbeing of teens, according to Meta spokesperson Andy Stone. 

The issue of social media's impact on teens gained renewed attention this fall after Facebook whistleblower Frances Haugen leaked hundreds of internal documents, some of which showed the company knew how Instagram can damage mental health and body image, especially among teenage girls.

Mosseri is expected to cite the platform's new feature encouraging users to log off after a certain period of time, Stone said, as well as new parental controls that will launch in the spring — both of which were highlighted in a product announcement this week ahead of the hearing. 

In his testimony, Mosseri is also expected to highlight how the company shares its data with external researchers and funds outside studies on the app’s impact, Stone said. (In her own testimony before House lawmakers, Haugen said Meta has routinely “gaslit” outside experts who have documented the company’s harms.)

And, in an echo of his boss Mark Zuckerberg, Mosseri will likely also express support for government regulation of social media as it relates to children, Stone added.

2:14 p.m. ET, December 8, 2021

Instagram has been accused of promoting pages glorifying eating disorders to teen accounts

From CNN's Donie O'Sullivan, Clare Duffy and Sarah Jorgensen

(Adobe Stock)

When Sen. Richard Blumenthal's team registered an account as a 13-year-old girl and proceeded to follow some dieting and pro-eating disorder accounts — the latter of which are supposed to be banned by Instagram — the platform's algorithm began almost exclusively recommending that the teenage account follow more and more extreme dieting accounts.

"I have to be thin," "Eternally starved," "I want to be perfect." These are the names of accounts that the platform promoted to the registered account.

Proof that Instagram is not only failing to crack down on accounts promoting extreme dieting and eating disorders but is actively promoting those accounts comes as Instagram and its parent company Facebook face intense scrutiny over the impact they have on young people's mental health.

Instagram acknowledged to CNN that those accounts broke its rules against the promotion of extreme dieting, and that they shouldn't have been allowed on the platform.

Blumenthal's experiment is not an anomaly, and may come as little surprise to regular users of Instagram who are familiar with how the platform's algorithm recommends accounts that it has determined a user might be interested in.

As Adam Mosseri, the head of Instagram, testifies on Capitol Hill Wednesday, he's expected to answer questions on reports and accusations such as these.

Read the full story here.

If you or someone you know has an eating disorder, NEDA (in the US) has phone, text, and chat services available on its website and Beat (in the UK) has phone and chat services available on its website.

2:07 p.m. ET, December 8, 2021

Instagram rolled out new features ahead of today's Senate hearing

From CNN's Samantha Murphy Kelly

(Courtesy Instagram)

Just a day before the head of Instagram will face questions from lawmakers over its child safety practices, the company rolled out a handful of new features aimed at making it harder for users, particularly teenagers, to fall down rabbit holes that could be harmful to their mental health.

On Tuesday, the company launched its Take a Break tool, which will encourage users to spend some time away from the platform after they've been scrolling for a certain period. The feature, announced in September, will first come to users in the United States, the United Kingdom, Canada and Australia, and to all users in the months ahead.

Users can turn on the feature in "Settings" and select if they want to be alerted after using the platform for 10 minutes, 20 minutes or 30 minutes. They'll then get a full-screen alert telling them to close out of the app, suggesting they take a deep breath, write something down, check a to-do list or listen to a song.

CNN Business tested the feature ahead of launch; while it's a step in the right direction, there's still room for improvement.

For example, users have to stay on the platform for one continuous session. If the app closes while you run to the bathroom or the screen turns off while you briefly browse Netflix, the timer resets. After the prompt encourages a break, the onus is on the user to resist hitting the big "done" button at the bottom of the message to return to the app.

Vaishnavi J, Instagram's head of safety and well-being, said the feature is still in its early stages and that its functionality will expand in 2022.

Instagram also said it will take a "stricter approach" to what content it recommends to teenagers and actively nudge them toward different topics if they've been dwelling on something — any type of content — for too long. While the company said it'll share more about the feature soon, a screenshot shared with CNN Business ahead of the announcement revealed that topics such as travel destinations, architecture and nature photography will be used to divert attention. The feature will launch next year.

Read more about the features here.

2:00 p.m. ET, December 8, 2021

The head of Instagram will be grilled soon in the Senate over the platform's child safety practices

Adam Mosseri speaking during the F8 Facebook Developers conference on April 30, 2019 in San Jose, California. (Justin Sullivan/Getty Images)

Adam Mosseri, the head of Instagram, is testifying at 2:30 p.m. ET before the Senate's Subcommittee on Consumer Protection, Product Safety, and Data Security.

The hearing is titled "Protecting Kids Online: Instagram and Reforms for Young Users," and Sen. Richard Blumenthal, a Democrat from Connecticut, chairs the subcommittee.

"News reports, whistleblower revelations, and academic research are providing a clearer view of the impact of social media applications on children and teens, especially on their mental health and wellbeing," according to a description of the hearing on the subcommittee website.

"Parents are deeply concerned about the product designs and powerful algorithms that push content to kids and create addiction-like behaviors. This hearing will address what Instagram knows about its impacts on young users, its commitments to reform, and potential legislative solutions," the description continued.

The issue of social media's impact on teens gained renewed attention this fall after Facebook whistleblower Frances Haugen leaked hundreds of internal documents, some of which showed the company knew how Instagram can damage mental health and body image, especially among teenage girls.

Facebook has repeatedly tried to discredit Haugen and said her testimony in Congress and reports on the documents mischaracterize the company's actions. But the outcry from Haugen's disclosures pressured the company to rethink the launch of an Instagram app for children under 13.

CNN's Samantha Murphy Kelly contributed reporting to this post.