Friday, April 25

OFCOM TO ENFORCE TOUGH NEW RULES PROTECTING CHILDREN ONLINE – From July 25, social media and internet platforms will be legally required to shield children from harmful online content or face steep penalties

From July 25, social media and internet platforms in the UK will be legally required to shield children from harmful online content—or face steep penalties—under new rules announced by Ofcom.

The country’s communications regulator has outlined over 40 measures as part of the Online Safety Act, mandating stronger protections on platforms frequently used by children, including social media, gaming, and search apps.

Under the new regulations, the highest-risk services—such as major social media platforms—must implement “highly effective” age verification tools to detect under-18 users. Their recommendation algorithms will be expected to filter out harmful content, and all platforms will be required to remove dangerous material quickly and provide children with easy-to-use reporting systems.

Ofcom chief executive Melanie Dawes called the changes a “reset” for children’s online safety and issued a firm warning to non-compliant companies. “These measures will lead to safer social media feeds, limit contact from strangers, and ensure robust age checks for adult content,” Dawes said.

The rules will also compel platforms to suppress harmful content such as abuse, violence, and bullying. Material considered especially damaging—such as content promoting suicide, self-harm, eating disorders, or pornography—must be entirely blocked from children’s feeds.

Companies that fail to comply could face substantial fines or, in severe cases, be taken offline in the UK.
