Instagram’s Adam Mosseri defends the app’s teen safety track record to Congress

OSTN Staff

Head of Instagram Adam Mosseri testified before Congress for the first time on Wednesday, defending the app's impact on teens and its aspiration to formally bring younger children into the fold.

In September, leaked documents from Facebook whistleblower Frances Haugen painted a picture of a company that knows it takes a toll on the mental health of some of its most vulnerable users.

“Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” researchers said in an internal presentation, reported by The Wall Street Journal. Internal research also found that within a group of teen Instagram users who said they experienced suicidal thoughts, 13% of British users and 6% of American users connected their desire to commit suicide to Instagram.

The company now known as Meta conducted that research internally and it was only brought to light through a revelatory series of reports in The Wall Street Journal. The leaked documents came up repeatedly in Wednesday’s hearing, with lawmakers on the Senate’s consumer protection subcommittee citing the revelations and pressing unsuccessfully for access to more of Instagram’s internal findings on its impacts on kids and teens.

Disturbing findings

In his opening statement, subcommittee chair Richard Blumenthal (D-CT) said that just prior to the hearing, his staff had once again created a test account to probe what Instagram recommends, and the app quickly served up harmful material. "…Within an hour, all of our recommendations promoted pro-anorexia and eating disorder content."

Ranking member Marsha Blackburn (R-TN) said her office also made a test account for a teen and found that it defaulted to "public" rather than to "private," as Instagram says accounts for users under 16 should. Mosseri admitted that Instagram had failed to enable that safety step for accounts created on the web.

"This is now the fourth time in the past two years that we have spoken with someone from Meta, and I feel like the conversation repeats itself ad nauseam," Blackburn said in her opening statement. "Nothing changes — nothing."

In the hearing, Mosseri followed Meta's approach to recent damning reporting, dismissing some of the findings outright, even intuitive ones. Responding to a question about Instagram's addictive nature, a quality most of its users could attest to, Mosseri asserted, "Respectfully, I don't believe that research suggests that our products are addictive."

Prior to Mosseri’s testimony, Facebook’s Global Head of Safety Antigone Davis appeared before the Senate subcommittee to address teen safety concerns. “We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17,” Davis argued, defending the company’s efforts.

Meta doubled down, defending its practices in light of the reports and Haugen’s subsequent testimony to U.S. lawmakers. The company argued that the precautions it takes on Instagram are adequate and that the research was taken out of context. “It is simply not accurate that this research demonstrates Instagram is ‘toxic’ for teen girls,” Facebook Head of Research Pratiti Raychoudhury wrote in a blog post slamming The Wall Street Journal’s reporting.

In late September, facing the firestorm of criticism, Mosseri announced that Instagram would pause its plans to develop Instagram Kids, a version of the app specifically for children under 13. The company faces ongoing criticism from the mental health community and lawmakers in the U.S. and abroad who believe that Instagram is not a responsible custodian for the well-being of children and teens.

In the hearing, Mosseri repeated the company’s argument that kids are already using the platform in spite of its age requirements and building a kid-specific app would create a layer of safety that doesn’t currently exist. “We know that 10- to 12-year-olds are online… we know that they want to be on platforms like Instagram,” Mosseri said. “And Instagram quite frankly wasn’t designed for them.”

Meta still wants to regulate itself

Mosseri also used the hearing to propose a new "industry body" that would create industry-wide best practices on issues like age verification, parental controls and product design for kids and teens. He took the notable step of stating that Instagram would be willing to follow rules from this theoretical pseudo-regulatory agency in order to "earn some of our Section 230 protections."

Blumenthal slammed Mosseri's proposal for self-regulation, pressing the Instagram head on what enforcement would look like in that scenario. Mosseri wasn't eager to agree with Blumenthal's suggestion that the U.S. Attorney General should be able to oversee enforcement if tech companies failed to meet their own standards. "Self-policing based on trust is no longer a viable solution," Blumenthal said, concluding the hearing.

Policy leads from YouTube, Snap and TikTok testified before Congress in October on the same issues, largely spending their time contrasting their own policies on kids and teens to those of rival Facebook. “Being different from Facebook is not a defense,” Blumenthal said during that hearing. “That bar is in the gutter. It’s not a defense to say that you are different.”

Last month, Instagram began testing "Take a Break," an opt-in feature that reminds users to take a break of up to 30 minutes from the famously addictive app. A day prior to Mosseri's testimony, the feature launched alongside the announcement that Instagram will roll out its first set of parental controls in March 2022. Those controls will let parents monitor and limit time spent on the app but fall short of the more powerful controls offered by rivals like TikTok.
