"It's time that the public and parents understand the true level of harm posed by these 'products' and it's time that young users have the tools to report and suppress online abuse," he said in written remarks made available before the hearing. Meta said in a statement that it is committed to protecting young people online, pointing to its backing of the same user surveys Bejar cited in his testimony and its creation of tools like anonymous notifications of potentially hurtful content.
In one 2021 email, Bejar flagged internal data to Zuckerberg and other top executives showing that 51 per cent of Instagram users had reported a bad or harmful experience on the platform in the previous seven days, and that 24.4 per cent of children aged 13-15 had reported receiving unwanted sexual advances.