UK DPA Fines TikTok €14.5 Million for Failing to Protect Children's Data
The UK's ICO fines TikTok €14.5 million for breaching GDPR rules on safeguarding children's data. Learn about the violations, implications, and lessons from this case.
The UK's data protection watchdog, the Information Commissioner's Office (ICO), has fined TikTok GBP 12.7 million (roughly EUR 14.5 million) for failing to comply with data protection principles under the UK GDPR, the UK's retained version of the EU General Data Protection Regulation (GDPR).
The UK GDPR mirrors the EU GDPR and applies to organizations that process the personal data of individuals in the UK. It sets out a number of data processing principles that organizations must comply with, including the principle of lawful, fair, and transparent processing. Under this principle, organizations must have a valid legal basis for collecting and processing personal data; consent is one such basis.
The UK GDPR also sets out specific rules for online services offered directly to children. A child's own consent is only valid if the child is at least 13 years old; below that age, organizations must obtain the consent of a parent or guardian. They must also provide clear and transparent information, in language children can understand, about how their data is collected and used.
What was the violation?
In TikTok's case, the ICO found that the company had allowed children under the age of 13 to create accounts on its platform and had processed their personal data without the parental consent the UK GDPR requires. The ICO estimated that up to 1.4 million UK children under 13 were using TikTok in 2020, despite the platform's own terms of service barring under-13s.
The ICO also found that TikTok had not taken adequate measures to prevent children from accessing its platform, and that it had not provided clear and transparent information to children about how their data was being collected and used.
What was the decision?
Based on these findings, the ICO fined TikTok GBP 12.7 million (roughly EUR 14.5 million). The final fine is significantly lower than the GBP 27 million proposed in the ICO's notice of intent, because the ICO did not pursue its provisional finding on the unlawful use of special category data. The fine is a reminder to all organizations that they must take data protection seriously, especially when handling children's data.
How could the fine have been avoided?
There are a number of steps that TikTok could have taken to avoid the fine. These include:
- Implementing stricter age verification measures to prevent children under the age of 13 from creating accounts.
- Providing clearer, more transparent information to children about how their data is being collected and used.
- Obtaining parental consent before collecting and processing the personal data of children under the age of 13.
By taking these steps, TikTok could have ensured that it was complying with the GDPR and avoiding the fine.
What are the implications of the fine?
The fine imposed on TikTok is significant for a number of reasons. First, it is the third-largest fine ever imposed by the ICO under the GDPR. Second, it is a warning to other tech companies that they must take data protection seriously, especially when it comes to children's data. Third, the fine could damage TikTok's reputation and make it more difficult for the company to attract users and advertisers.
The fine is also a reminder that the GDPR is a powerful tool that can be used to protect the privacy of individuals. Organizations that fail to comply with the GDPR can face significant fines, and they can also damage their reputation and lose the trust of their users.
What can we learn from this case?
There are a number of lessons that we can learn from this case. First, it is important for organizations to have a clear understanding of the GDPR and the requirements that it imposes. Second, organizations need to take steps to ensure that they are compliant with the GDPR, especially when it comes to children's data. Third, organizations should be prepared to face the consequences if they fail to comply with the GDPR.
The GDPR is a complex piece of legislation, but it is important for organizations to understand and comply with it. By doing so, they can protect the privacy of individuals and avoid the risk of significant fines.