4 minute read

Joint ICO–Ofcom Statement on Age Assurance: clear expectations for protecting children online

The UK’s Information Commissioner’s Office (“ICO”) and Ofcom have today published a joint statement on age assurance (“Statement”). It marks a significant step in their collaborative efforts to protect children online and signals a stronger phase of monitoring and regulatory oversight. 

The Statement is aimed at services that are likely to be accessed by children and fall within the scope of the Online Safety Act (“OSA”) and UK data protection legislation. It offers clear, practical guidance on how digital service providers can meet their obligations under both legislative frameworks, detailing how platforms should approach age assurance in a way that protects children while respecting their privacy. To support implementation, the Statement includes hypothetical examples of processes that meet both online safety and data protection requirements in practice.

Why Age Assurance Matters

Children face growing risks online, from exposure to inappropriate content to misuse of their data. In response, Ofcom and the ICO emphasise that robust age assurance is essential. Where a service sets a minimum age limit, that limit must be enforced using methods that are effective, proportionate, and compliant with data protection law.

The Statement responds directly to concerns about children accessing digital platforms and services that are not designed for them. It follows the UK Government’s landmark consultation launched earlier this month on major measures to protect children using social media, gaming platforms and AI chatbots. The consultation, which closes on 26 May 2026, seeks views on:

  • whether there should be a minimum age for social media, and if so, what age would be appropriate; 

  • how age verification enforcement should be strengthened; and 

  • how to help children and parents navigate the digital world and thrive online.

The Statement also builds on recent coordinated announcements from the ICO and Ofcom. On 12 March 2026, Ofcom set out a clear demand for further action, aiming to hold tech firms publicly accountable for creating a safe online environment for children. It challenged major sites and apps to prove their genuine commitment to protecting children online by enforcing minimum age rules with highly effective age checks. 

At the same time, the ICO issued an open letter to social media and video-sharing platforms operating in the UK, urging them to strengthen their age assurance measures. In its letter, the ICO called on platforms to: 

  • stop relying on self-declaration as the primary age check, which it considers ineffective and easily bypassed; 

  • urgently review and strengthen age‑assurance measures to stop users under the age of 13 from accessing their services; and

  • implement effective age gates to enforce the platform’s own minimum age requirement.

The ICO recommends that platforms make use of viable technology that is now readily available to enforce minimum age requirements and prevent children from accessing their services. Examples include facial age estimation, digital ID and one-time photo matching.

Reflecting this tougher stance, the ICO has stepped up its related enforcement activity. It recently fined Reddit £14.47 million and MediaLab (owner of Imgur) £247,590 for failing to implement age‑assurance measures and for processing children’s personal information unlawfully in ways that potentially exposed children to inappropriate, harmful content. 

Key Guidance for Digital Services

The Statement reiterates the regulators’ position that self-declaration is not an appropriate age assurance method. However, it maintains that its recommended approach is risk-based, flexible, tech-neutral and future-proofed, allowing service providers discretion to choose the most appropriate age assurance methods. 

The Statement outlines the main areas of interaction between online safety and data protection as they relate to age assurance, and sets out several core expectations for online services:

  • Assess the Risks: Service providers must evaluate the risks their platforms pose to children and determine if age assurance is necessary.

  • Choose the Right Methods: Where a service falls in scope of the OSA age assurance duties, age assurance techniques must be highly effective at determining whether a user is a child, and proportionate to the risk, avoiding over-collection of personal data.

  • Comply with Data Protection: All age assurance methods must comply with UK data protection laws, safeguarding children’s privacy.

Where the OSA requires age checks - particularly for user-to-user services that permit “primary priority” content (i.e. pornography and content relating to self-harm, suicide and eating disorders) and service providers that publish or display their own pornographic content - platforms must deploy “highly effective age assurance” (HEAA) to prevent children from encountering such material. A HEAA process should meet four criteria: technical accuracy, robustness, reliability and fairness, whilst still having regard to accessibility and interoperability in order to ensure a service remains easy to use and works for all users.

From a data protection perspective, the Statement underlines that where a service provider applies a minimum age of 13, it is unlikely to have a lawful basis for processing the personal data of children under that age. A service provider’s focus should accordingly be on preventing under-13s from accessing their service by implementing an effective age gate. 

Whatever age assurance method is chosen, service providers must ensure it is necessary, proportionate to the risks involved, and compliant with data protection principles, including data minimisation, storage limitation, and purpose limitation. They must also be transparent about how age-related data is used, provide mechanisms for users to challenge inaccurate age assurance decisions, and conduct data protection impact assessments where processing is likely to present a high risk to children. To help demonstrate compliance, service providers are encouraged to consider ICO-approved data protection certification schemes such as the Age Check Certification Scheme (ACCS).

Conclusion

The message from Ofcom and the ICO is clear: protecting children online requires strong, effective, and privacy-respecting age assurance. Digital service providers must act now to ensure children are kept safe and their data is protected, reinforcing the UK’s commitment to a safer online environment for young people.

Ofcom and the ICO are working closely together on their shared goal of protecting children from harm online.

Tags

data protection, cyber, london, technology, commercial data & tech