Ofcom to Audit Algorithms

31 October 2025

What happened

Ofcom, the UK communications regulator, has begun auditing major tech companies' algorithms under the Online Safety Act. The Act requires platforms to implement robust age verification and content moderation to prevent minors from accessing harmful material. Ofcom will assess the effectiveness of these systems and of algorithmic curation, and can require adjustments to limit risks. Non-compliance can incur fines of up to 10% of global annual revenue. Firms must also complete a 'children's access assessment' and a 'children's risk assessment'.

Why it matters

This introduces a significant operational constraint for platform operators, who may need to re-evaluate and potentially re-engineer their algorithmic content curation and age verification systems. Compliance teams face increased due diligence requirements to demonstrate adherence to the new 'children's access' and 'children's risk' assessments. The implementation and maintenance burden falls on platform engineering and content moderation teams, while legal and compliance departments manage the heightened financial exposure from potential revenue-based fines.

Source: ft.com
