Wednesday, August 20

SOCIAL MEDIA STILL FLOODING TEENS WITH SUICIDE CONTENT – This is despite new online safety laws

There are alarming revelations about what teenagers are still being exposed to online. 

A new investigation has found that social media platforms like Instagram and TikTok are continuing to bombard young users with content linked to depression, self-harm, and even suicide—despite new online safety laws meant to protect children.

The Molly Rose Foundation, a charity set up in memory of 14-year-old Molly Russell, opened a dummy account posing as a 15-year-old girl. Within hours of interacting with posts on suicide and depression, the account was hit with what the foundation describes as a “tsunami of harmful content.”

The numbers are stark: 97% of the videos recommended on Instagram Reels and 96% on TikTok’s For You Page were deemed harmful. More than half of TikTok’s harmful recommendations directly referenced suicide and self-harm, with 16% even detailing specific methods—some previously unknown to researchers.

Andy Burrows, Chief Executive of the Molly Rose Foundation, says these algorithms are operating at an “industrial scale,” continuing to push teenagers toward shocking levels of harmful content.

While the companies have made it harder to search for certain hashtags, the report says personalised AI recommender systems still amplify damaging material once a user has engaged with it.
