Ex-Employee Accuses Meta of Giving False Information to the Public Regarding Instagram Safety

A former executive at Meta, the parent company of social media platform Instagram, has accused the company of misleading the public about the safety of its app for teenagers. Arturo Bejar, who left Meta in 2021, testified before US senators this month, stating that Instagram is not suitable for children as young as 13 and that the company failed to address his concerns. Mr. Bejar’s testimony adds to the scrutiny faced by the tech giant regarding its impact on teenagers. He revealed that his teenage daughter and her friends have received unwanted sexual advances on Instagram for several years. He also criticized the design of the app, stating that it discourages teenagers from reporting uncomfortable experiences.

Mr. Bejar, who has been working in the technology industry since he was 15, joined Facebook, now known as Meta, in 2009. He initially worked as an engineering director on the Protect and Care team before leaving in 2015 to spend more time with his children. However, in 2019, he became concerned about his 14-year-old daughter receiving unwanted sexual advances on Instagram, prompting him to return to Meta as a consultant focused on safety technology.

During Mr. Bejar’s second stint at the company, damaging leaks about Meta began to emerge. Frances Haugen, a data scientist, publicly accused Meta of knowing that Instagram harmed the self-esteem of teenage girls, and she provided supporting evidence to US senators. Meta vehemently denied these claims. While Ms. Haugen was going public, Mr. Bejar privately raised his concerns with Sheryl Sandberg, Facebook’s operations chief, and Adam Mosseri, the head of Instagram. He also sent a detailed research report to Meta CEO Mark Zuckerberg, highlighting the harm teenagers were experiencing on Instagram. However, his concerns were allegedly disregarded, and Mr. Zuckerberg never responded.

Mr. Bejar emphasized that his research provided statistically significant evidence that millions of teenagers were facing safety issues on Meta’s apps, but that the company ignored it. In response, a Meta spokesperson said it was “absurd” to suggest that the company’s surveys of how users perceive Instagram conflict with its transparency reports, adding that Meta acts on both sets of metrics and continues to work on improving its safety measures.

However, Mr. Bejar argued that social media companies should be obligated to collect and disclose data on the number of children facing unwanted sexual advances on their platforms. While he acknowledged the importance of encrypted messages, he expressed uncertainty about their suitability for children.

When asked whether he believed children’s safety was a top priority for Meta, Mr. Bejar pointed out that many researchers on the Instagram wellbeing team had been laid off since his departure, which, he argued, showed where the company’s priorities lay.

It remains to be seen how Meta will respond to these allegations and whether further action will be taken to address the safety concerns surrounding Instagram for teenagers.