A former senior staff member at Meta says Instagram is not doing enough to protect teens from sexual harassment.
Arturo Bejar, who is testifying in front of the US Congress on Tuesday, said he thinks whistleblowing means he will not work in the industry again.
He worked for Meta, which owns Facebook and Instagram, between 2009 and 2015, and again from 2019 to 2021.
Meta said it had brought in "over 30 tools" to support a safe environment for teens online.
Mr Bejar said it was his daughter's experience of Instagram that first made him think there was a problem.
Speaking to Zoe Kleinman, the BBC's technology editor, he said: "Shortly after she went on Instagram, she started getting unwanted sexual advances – misogyny, harassment at 14."
"When we would talk about this… it turns out that all of her friends were experiencing the same.
"I was shocked… she said there was nothing [she] could do, because [she] had no option to report it."
Mr Bejar hopes that discussing his experience and that of his daughter in Congress will help give lawmakers the information they need to take action.
"We're in a very extraordinary time where there's consensus across the political spectrum about the urgency and necessity of passing legislation that protects our kids, all of our kids," he said.
He said it would be "easy" for Meta to implement a button specifically to let teens flag messages as sexual advances.
'No transparency'
"I can speak first hand about how easy it is to build a button and a counter," he said.
"I believe that the reason that they're not doing this is because there's no transparency about the harms that teenagers are experiencing on Instagram.
"And that's why I'm coming forward right now… this is my retirement from technology."
Currently, people can report Instagram messages for a range of reasons, including for containing "sexual exploitation or solicitation".
A Meta spokesperson told the BBC it has created several features to protect teens online, such as implementing anonymous notifications of potentially hurtful content.
"Every day countless people inside and outside of Meta are working on how to help keep young people safe online," they said.
"Working with parents and experts, we have also introduced over 30 tools to support teens and their families in having safe, positive experiences online."
In 2021, Instagram introduced measures including making under-16 user accounts private by default and only letting older users message teens who followed them.
'The least that we can do'
Mr Bejar, who was director of engineering at Facebook and responsible for its "protect and care team", said tools implemented by Instagram didn't go far enough, and were instead a "placebo for press and regulators".
"They're not based on the data of what people are experiencing," he said.
"What you would expect to be able to ask them on this is, what percentage of teens experienced unwanted sexual advances?
"If you go into [Instagram] messages, I could not find any option that says: this is an unwanted advance."
According to the whistleblower, building a button that teens feel comfortable pressing is "the least that we can do", because he claims the "report" button on Instagram may be underused.
"Research we did in 2011 shows that 13-year-olds are uncomfortable with the word 'report', because they worry that they will get themselves or somebody else in trouble," he said.
"Imagine you're a 13-year-old, and you get an unwanted sexual advance – how uncomfortable that is, how intense that experience is, and there's nothing that they can use to say: 'can you please help me with this?'
"If that button was available, then there would be data about who's initiating those contacts."
Meeting with Mosseri
Mr Bejar said he gathered information about this and went to "the top people" at Meta, including Instagram head Adam Mosseri, to discuss his concerns in 2021.
"I came out of that meeting feeling like Adam completely understood the issue, to the point where we talked about how you would design that button," he said.
"But I was not sure whether they were going to act on it."
He claimed that internal statistics showed one in eight 13- to 15-year-olds had experienced an unwanted sexual advance on Instagram within a week.
The BBC has seen documents which show Mr Bejar flagged this statistic to Mr Mosseri.
"I deeply felt that they had a responsibility now," he said.
"I asked Adam in an email… what should be an acceptable number or percentage of 13 to 15-year-olds who receive an unwanted sexual advance?
"Social media should not be a place where a kid receives those kinds of things."
Hundreds of lawsuits
Meta, and other social media companies, are facing lawsuits in the US over the impact of social media platforms on teen mental health.
In October, dozens of US states filed a lawsuit arguing Meta had misled the public over the risks of social media use and had contributed to a youth mental health crisis.
At the time, a spokesperson for Meta said: "We're disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path."
It followed an investigation in 2021 by several state prosecutors, after whistleblower Frances Haugen testified in the US that Meta knew its products could harm children.
Ms Haugen's testimony followed a 2021 Wall Street Journal report leaking internal studies from the firm which it said showed teenagers blamed Instagram for increased levels of anxiety and depression.
Instagram published a lengthy blog defending its research in response, saying the report focused "on a limited set of findings and casts them in a negative light".
"The research actually demonstrated that many teens we heard from feel that using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced," Pratiti Raychoudhury, vice-president and head of research at Meta, said at the time.