

Meta researchers have found that teens who reported that Instagram often made them feel worse about their bodies were exposed to significantly more 'eating disorder adjacent content' than peers who did not, according to an internal study.
The posts frequently displayed chests, waists, or thighs, included explicit judgments about body types, or contained references to disordered eating and negative body image. Although such material is not banned, parents, teens and experts have warned Meta that it may be harmful.
The research, conducted over the 2023–2024 academic year, surveyed 1,149 teens about whether Instagram use left them feeling bad about their bodies. For 223 teens who often reported such dissatisfaction, “eating disorder adjacent content” made up 10.5 per cent of what they saw, compared with just 3.3 per cent for other teens. “Teens who reported frequent body dissatisfaction saw about three times more body-focused/ED-adjacent content than other teens”, the authors wrote.
Researchers also found these teens encountered provocative content more broadly. Meta classified it under 'mature themes', 'risky behaviour', 'harm & cruelty' and 'suffering'. Altogether, such content made up 27 per cent of their feeds, nearly double the 13.6 per cent seen by teens who did not report negative feelings.
The report emphasised that the study did not prove Instagram caused body dissatisfaction, noting teens might actively seek out troubling material. Still, it acknowledged Instagram exposed vulnerable teens to “high doses” of content that Meta’s own advisors support limiting.
Meta spokesperson Andy Stone said the document showed the company’s commitment to understanding and improving its products. “This research is further proof we remain committed to understanding young people’s experiences and using those insights to build safer, more supportive platforms for teens”, he said, pointing to recent steps to align teen content with PG-13 movie standards.
However, the study also revealed shortcomings. Meta’s existing moderation tools failed to detect 98.5 per cent of the “sensitive” content researchers flagged, as those systems were designed to catch rule violations, not borderline material. Researchers said this was unsurprising since work on new detection algorithms had only just begun.
The document, marked 'Do not distribute internally or externally without permission', is the latest in a string of internal studies showing links between viewing fashion, fitness, or beauty content and reporting worse body image. Meta has faced state and federal investigations in the US, as well as lawsuits from school districts alleging the company misled the public about Instagram’s safety for youth. Past leaks have also revealed researcher concerns that algorithmic recommendations could harm teens with preexisting body image issues.
The report included examples ranging from images of scantily dressed women to violent fight videos and disturbing illustrations. One drawing showed a crying figure with phrases like “make it all end” scrawled across it; another post depicted a closeup of a lacerated neck. While not banned under Meta’s rules, researchers warned colleagues with a “sensitive content” notice when sharing these findings. - Reuters