A report surfaced on Thursday, June 20, alleging that Instagram often suggests explicit videos to adolescent users. Meta denied the allegations, claiming that the experiment does not reflect reality.
Instagram Algorithm Test
Over the course of seven months, researchers from the Wall Street Journal (WSJ) and an academic institution found that Instagram Reels almost instantly served explicit content to test accounts that claimed to be 13 years old.
Teen accounts that showed interest in explicit videos were then shown increasingly inappropriate content, including videos produced by adult sex-content creators.
According to the WSJ, internal tests and an investigation Meta previously carried out produced comparable findings.
'Artificial Experiment'
Andy Stone, a representative for Meta, issued a statement claiming that WSJ's study was an "artificial experiment" that does not match the reality of how teens use the social media app.
Stone told The Hill that the company is committed to continually improving its policies and has teams specifically devoted to ensuring teenagers see age-appropriate material, including when they first sign up.
He added that, as part of its ongoing work on youth safety concerns, Instagram had made significant progress over the previous several months in reducing the amount of mature content teens can view on the platform.
In January, Meta changed its approach to teen accounts, hiding age-inappropriate content and automatically placing those accounts under its strictest content-control settings.