
European regulators are increasing pressure on TikTok to strengthen age checks as concerns grow about underage access to social media platforms across the region.
The company confirmed it will begin rolling out a new age-detection system in Europe in phases over the coming months following roughly a year of internal testing.
TikTok, which is owned by ByteDance, outlined the initiative in briefings reported by Reuters.
The updated system moves beyond user-declared birth dates and instead relies on automated analysis to identify accounts potentially operated by children under 13.
TikTok said the technology assesses profile details, uploaded videos, and behavioural signals to detect patterns commonly linked to younger users.
Accounts flagged by the software are not removed automatically; instead, they are passed to trained human moderators for further review.
Moderators determine whether platform rules have been breached before any enforcement action, including permanent account removal, is applied.
The company said the system aims to improve child protection while limiting unnecessary collection of sensitive personal data.
Regulators across Europe argue that existing age-check measures on social platforms are either too weak or too invasive to be effective.
Policymakers in several EU member states are debating how to balance child safety with privacy rights under existing data protection laws.
Some governments outside the bloc have already taken tougher positions, with Australia announcing a ban on social media access for users under 16.
Denmark has proposed restricting access for users aged 15 or younger as part of a broader review of youth online safety.
In the UK, TikTok pilot programmes reportedly led to the removal of thousands of accounts linked to children under 13.
TikTok will introduce a formal appeals process allowing suspended users to verify their age through third-party provider Yoti.
Verification options will include facial age estimation, government-issued identification, or a credit card check during the appeal process.
Similar age-appeal systems are already used by Meta, which owns Facebook and Instagram.
Ireland has become a focal point for enforcement, with media regulator Coimisiún na Meán investigating TikTok under the Digital Services Act.
The regulator is also examining LinkedIn over whether reporting tools and moderation processes meet EU requirements.
In 2025, Ireland's Data Protection Commission fined TikTok €530 million for GDPR violations, and it has separately imposed a €310 million penalty on LinkedIn.
Irish authorities are additionally reviewing potential compliance issues at Elon Musk's X, formerly Twitter.
EU law now requires platforms to clearly explain how automated moderation tools work and to demonstrate their accuracy and effectiveness.
Regulators say future compliance will depend on transparency, clear user notifications, accessible appeals, and ongoing national oversight.