January 3, 2024

The Top 10 Biggest GDPR Fines and Penalties of 2023 [Updated December 2023]

Uncover the biggest GDPR fines of 2023, revealing unprecedented penalties against tech giants like Meta and TikTok. Delve into violations, impacts, and crucial lessons shaping data protection in the digital age.

2023 witnessed a seismic shift in the landscape of data protection enforcement under the General Data Protection Regulation (GDPR). Record-breaking fines, particularly against tech giants, sent a clear message from data protection authorities: data privacy is paramount, and non-compliance will come at a steep price.

Let's delve into the top 10 biggest GDPR fines and penalties of 2023, revealing the companies, violations, and lessons learned for all businesses handling personal data.

Here are the ten largest GDPR fines of 2023:

  1. Meta Platforms Ireland Ltd. (EUR 1.2 billion)
  2. Meta Platforms Ireland Ltd. (EUR 390 million)
  3. TikTok Ltd. (EUR 345 million)
  4. Criteo (EUR 40 million)
  5. TikTok (GBP 12.7 million)
  6. TIM SpA (EUR 7.6 million)
  7. WhatsApp Ireland Ltd. (EUR 5.5 million)
  8. Clearview AI Inc. (EUR 5.2 million)
  9. Spotify (EUR 4.9 million)
  10. Trygg-Hansa (EUR 3 million)

Let's do a deep dive into each one to see what happened and the impact of their violations.

Meta Platforms Ireland Ltd. (EUR 1.2 billion)

In May 2023, the Irish Data Protection Commission (DPC) issued a record-breaking EUR 1.2 billion fine against Meta Platforms Ireland Ltd., the parent company of Facebook and Instagram. This landmark decision sent shockwaves through the tech industry and ignited a global conversation about data privacy and regulatory enforcement.

Why was Meta fined?

The fine stemmed from one central violation of the General Data Protection Regulation (GDPR): transferring EU user data to the US without adequate safeguards.

The DPC found that Meta continued to rely on standard contractual clauses (SCCs) to move Facebook users' data to the US even after the Court of Justice of the European Union (CJEU) ruled, in Schrems II, that such transfers require supplementary protections because of the reach of US surveillance law. In practice, this meant Meta was potentially exposing the personal data of millions of EU users to access by US government agencies, in breach of the GDPR's rules on international data transfers.

Impact of the fine

The EUR 1.2 billion fine was the largest ever imposed under the GDPR, demonstrating the seriousness of Meta's violations. It also served as a strong signal to other tech companies that the EU is taking data privacy enforcement seriously and will not hesitate to impose significant penalties for non-compliance.

Meta Platforms Ireland Ltd. (EUR 390 million)

While Meta's EUR 1.2 billion GDPR fine for data transfers grabbed headlines, another significant penalty, EUR 390 million, landed earlier in the year, focusing on the murky world of online advertising and user consent. This fine, issued by the Irish Data Protection Commission (DPC) in January 2023, exposed serious flaws in Meta's data practices surrounding personalized ads on Facebook and Instagram.

What was the issue?

The DPC found that Meta violated the GDPR in two key ways:

  1. Unlawful reliance on "performance of a contract" as a legal basis for targeted advertising: Meta argued that serving personalized ads was necessary to fulfill the "contract" between itself and its users. However, the DPC and the European Data Protection Board (EDPB) disagreed, stating that users had not explicitly agreed to their data being used for such purposes. Bundling agreement to personalized ads into the terms of service effectively left users with a stark choice: accept the processing or abandon the platform altogether.
  2. Lack of transparency and user control over data used for advertising: The DPC found that Meta's privacy policies and ad settings were too complex and confusing for users to understand how their data was being used for targeted advertising. This lack of transparency made it difficult for users to exercise their right to control how their data was processed.

Impact of the fine

The EUR 390 million fine, alongside the record-breaking EUR 1.2 billion penalty, sent a clear message to Meta and other tech giants: users' data privacy is paramount, and consent must be freely given and truly informed.

The case also highlighted the need for:

  • Simpler and clearer privacy policies and user controls: Users should be able to easily understand and manage how their data is used for advertising purposes.
  • More granular control over data sharing: Users should have the option to choose which types of data are used for targeting and with whom it is shared (a minimal sketch of such a purpose-level consent record follows this list).
  • Development of alternative advertising models: Businesses need to explore ways to provide targeted advertising without relying on invasive data collection practices.
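
To make the idea of granular, purpose-level consent concrete, here is a minimal TypeScript sketch of what a per-purpose consent record might look like. The purpose names, field names, and overall shape are illustrative assumptions for this article, not a prescribed GDPR data model or Meta's actual implementation.

```typescript
type ConsentPurpose = "essential" | "analytics" | "personalizedAds" | "thirdPartySharing";

interface ConsentRecord {
  userId: string;
  timestamp: string; // ISO 8601 string recorded when the choice was captured
  purposes: Record<ConsentPurpose, boolean>;
}

// Every non-essential purpose starts as "not consented"; consent must be an explicit opt-in.
function defaultConsent(userId: string): ConsentRecord {
  return {
    userId,
    timestamp: new Date().toISOString(),
    purposes: {
      essential: true,          // strictly necessary processing only
      analytics: false,
      personalizedAds: false,
      thirdPartySharing: false,
    },
  };
}

// Check the specific purpose before running the corresponding processing.
function hasConsent(record: ConsentRecord, purpose: ConsentPurpose): boolean {
  return record.purposes[purpose];
}
```

The key design point is that each purpose is consented to (or refused) separately and is checked individually before the corresponding processing runs, rather than being bundled into a single take-it-or-leave-it agreement.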

TikTok Ltd. (EUR 345 million)

In September 2023, the Irish Data Protection Commission (DPC) hit TikTok, the popular short-form video platform, with a hefty EUR 345 million fine, making it one of the largest GDPR fines ever levied against a social media company. This landmark decision raised concerns about the data privacy of children online and sent a strong message to tech giants about the importance of safeguarding youth.

What went wrong?

The DPC's investigation focused on TikTok's handling of children's personal data between July 2020 and December 2020. The key violations identified included:

  • Public-by-default settings: Although TikTok markets itself heavily to teenagers, its default settings made users' profiles and videos public, potentially exposing children's data to a far wider audience than they may have intended.
  • Ineffective "Family Pairing" feature: Designed to allow parents to monitor their children's activity, the DPC found the "Family Pairing" feature lacked adequate verification measures and could be easily bypassed, rendering it practically useless in protecting children.
  • Inadequate transparency for child users: The DPC determined that TikTok's privacy policies and data practices were too complex and confusing for young users to understand, failing to provide them with meaningful control over their information.

Impact and implications

The EUR 345 million fine served as a significant wake-up call for TikTok and other companies targeting young audiences. It highlighted the need for:

  • Stronger age verification systems: Platforms need robust processes to ensure only users above the minimum age requirement can access their services.
  • Privacy-by-design settings: Default settings should prioritize user privacy, especially for children, with any increase in visibility left as a conscious, explicit choice (see the sketch after this list).
  • Clear and age-appropriate privacy policies: Companies must present information about data collection and use in a way that children can easily understand and make informed decisions.
  • Greater parental control measures: Platforms should provide parents with effective tools to monitor and manage their children's online activity.
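
As a rough illustration of privacy-by-design defaults, the TypeScript sketch below derives account settings from the user's declared age, starting from the most protective configuration. The field names and the age threshold are assumptions made for the example; this is not TikTok's actual settings model.

```typescript
interface AccountSettings {
  profileVisibility: "private" | "public";
  allowDirectMessages: boolean;
  allowPersonalizedAds: boolean;
}

// Start every account from the most protective configuration; minors keep
// extra restrictions, and any increase in visibility must be an explicit choice later.
function defaultSettingsForAge(age: number): AccountSettings {
  const isMinor = age < 18; // illustrative threshold; GDPR Article 8 lets member states set 13-16 for consent
  return {
    profileVisibility: "private",   // public visibility is never the default
    allowDirectMessages: !isMinor,  // messaging disabled by default for minors
    allowPersonalizedAds: false,    // ad personalization always needs a separate opt-in
  };
}
```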

Criteo (EUR 40 million)

In June 2023, French data protection authority CNIL delivered a EUR 40 million fine to Criteo, a global adtech giant, for a multitude of GDPR violations related to its targeted advertising practices. This hefty penalty not only sent shockwaves through the adtech industry but also served as a powerful reminder of the importance of user consent, transparency, and data control in the digital age.

Unveiling the Criteo Case

CNIL's investigation uncovered several concerning practices by Criteo:

  • Lack of proper consent: Criteo collected and used personal data for personalized advertising without obtaining valid consent from users. This included failing to provide clear and accessible information about data processing purposes and failing to offer a genuine, free choice for users to opt in or out.
  • Opaque data practices: Criteo's privacy policies and data collection mechanisms were deemed too complex and unclear for users to understand how their data was being used and shared with third-party partners. This lack of transparency hampered users' ability to exercise control over their data.
  • Inadequate control mechanisms: Criteo's platforms lacked user-friendly mechanisms for individuals to easily access, rectify, or erase their personal data. This hindered users' ability to exercise their fundamental GDPR rights.

Impact and Implications

The EUR 40 million fine inflicted upon Criteo resonated throughout the adtech world, highlighting the seriousness of data privacy violations in the realm of targeted advertising. The case underscores the crucial need for:

  • Clear and unambiguous consent mechanisms: Companies must obtain freely given, informed, and specific consent for data processing activities, particularly those involving targeted advertising. This requires providing concise and easily understandable information about how data will be used and offering genuine opt-in and opt-out choices (a simplified consent gate is sketched after this list).
  • Increased transparency: Adtech companies must strive for greater transparency in their data practices. This includes clear and accessible privacy policies, transparent data collection methods, and readily available information about third-party data sharing.
  • Empowering user control: Companies must prioritize user control over personal data. This includes providing user-friendly mechanisms for individuals to access, rectify, erase, or restrict the processing of their data and complying with data subject rights requests promptly and effectively.
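
On the technical side, "genuine opt-in" usually comes down to never firing advertising tags until an explicit choice has been recorded. The simplified TypeScript sketch below shows that gating; the storage key and tag URL are hypothetical placeholders, not Criteo's or any consent platform's real API.

```typescript
type AdConsent = "granted" | "denied" | "unset";

// "unset" (no decision yet) must be treated exactly like "denied": no pre-ticked boxes.
function getStoredAdConsent(): AdConsent {
  const value = localStorage.getItem("adConsent"); // hypothetical storage key
  return value === "granted" || value === "denied" ? value : "unset";
}

// Load the (hypothetical) advertising tag only after an explicit opt-in.
function maybeLoadAdTag(): void {
  if (getStoredAdConsent() !== "granted") {
    return; // no consent, no tracker
  }
  const script = document.createElement("script");
  script.src = "https://ads.example.com/tag.js"; // placeholder URL for the example
  document.head.appendChild(script);
}
```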

TikTok (GBP 12.7 million)

In what might seem like a smaller penalty compared to the billion-euro behemoths of 2023, the UK Information Commissioner's Office (ICO) delivered a GBP 12.7 million fine to TikTok in April 2023. This seemingly modest penalty, however, packs a powerful punch, marking a significant victory for children's online privacy and reminding companies that no platform is too big to be held accountable for data protection violations.

Uncovering the issue

The ICO's investigation focused on TikTok's handling of children's personal data between May 2018 and July 2020. Their findings were concerning, revealing significant lapses in child protection measures.

  • Unrestricted access for underage users: Despite its terms of service prohibiting users under 13, TikTok failed to implement effective age verification processes. This resulted in an estimated 1.4 million UK children under 13 accessing the platform, exposing them to potentially harmful content and putting their data at risk.
  • Lax data collection and use: TikTok collected the personal data of these underage users, including usernames, birthdays, and video viewing habits, without obtaining valid parental consent. This constituted a clear violation of UK data protection law and GDPR requirements for protecting children's online privacy.
  • Inadequate data security: Concerns were raised about the security of children's data collected by TikTok. The platform lacked sufficient safeguards to prevent unauthorized access, use, or disclosure of this sensitive information.

Impact and Implications

The GBP 12.7 million fine, while less than the record-breaking GDPR penalties, served as a significant wake-up call for TikTok and other social media platforms targeting young audiences. It highlighted the need for:

  • Robust age verification systems: Platforms must implement reliable age verification mechanisms to ensure only users above the minimum age requirement can access their services (a minimal declared-age check is sketched after this list).
  • Privacy-by-design for children: Default settings should prioritize children's privacy, limiting data collection and exposure to potentially harmful content.
  • Clear and accessible data practices: Companies must present information about data collection and use in a way that children can understand and make informed choices.
  • Enhanced parental control tools: Platforms should provide parents with effective tools to monitor their children's online activity and manage their data privacy.
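
A declared date of birth is the weakest form of age assurance, but it shows where an age check sits in a signup flow. The TypeScript sketch below is a minimal, hypothetical version; a robust system would combine it with further signals rather than rely on self-declaration alone.

```typescript
const MINIMUM_AGE = 13; // TikTok's own terms of service set 13 as the minimum age

// Derive age from a declared date of birth. Self-declaration alone is weak;
// it is shown here only to illustrate where the check belongs in the flow.
function ageFromDateOfBirth(dob: Date, now: Date = new Date()): number {
  const age = now.getFullYear() - dob.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > dob.getMonth() ||
    (now.getMonth() === dob.getMonth() && now.getDate() >= dob.getDate());
  return hadBirthdayThisYear ? age : age - 1;
}

// Block registration outright (rather than silently collecting data) when the check fails.
function canRegister(dateOfBirth: Date): boolean {
  return ageFromDateOfBirth(dateOfBirth) >= MINIMUM_AGE;
}
```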

TIM SpA (EUR 7.6 million)

In September 2023, Italian telecommunications giant TIM SpA found itself at the center of a EUR 7.6 million GDPR fine levied by the Italian Data Protection Authority (Garante per la protezione dei dati personali). While overshadowed by some of the year's larger penalties, this case serves as a potent reminder of the importance of robust cybersecurity measures and the consequences of failing to safeguard user data.

Unraveling the TIM SpA Incident

The Garante's investigation revealed that TIM SpA suffered a series of data breaches between 2015 and 2017, compromising the personal information of millions of customers. Hackers gained unauthorized access to sensitive data, including:

  • Names and addresses
  • Phone numbers
  • Financial information
  • Email addresses
  • Call records

These data breaches not only exposed individual users to potential fraud and identity theft but also eroded trust in TIM SpA's ability to protect its customers' sensitive information.

GDPR Violations and the EUR 7.6 Million Fine

The Garante found that TIM SpA violated several key GDPR principles:

  • Inadequate security measures: The company failed to implement appropriate technical and organizational safeguards to protect personal data from unauthorized access, loss, or damage.
  • Lack of data breach notification: TIM SpA delayed notifying the Garante and affected individuals about the data breaches, further compromising user trust and hindering timely mitigation efforts.
  • Insufficient transparency: The company's privacy policies and information on data security practices were deemed confusing and unclear, failing to provide users with a comprehensive understanding of how their data was being handled.

Impact and Implications

The EUR 7.6 million fine, although not the largest of 2023, served as a significant blow to TIM SpA's reputation and a reminder of the financial consequences of data breaches. The case highlights the crucial need for:

  • Prioritizing cybersecurity: Companies must invest in robust cybersecurity measures, including regular security audits, data encryption, and employee training on data protection practices.
  • Prompt data breach notification: Timely notification of data breaches to the authorities and affected individuals is essential for minimizing harm and complying with GDPR requirements (see the 72-hour deadline sketch after this list).
  • Enhancing data transparency: Companies must strive for clear and accessible privacy policies, explaining data collection practices, user rights, and security measures in a straightforward manner.
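
Article 33 of the GDPR gives controllers 72 hours from becoming aware of a breach to notify the supervisory authority. A minimal TypeScript sketch of tracking that deadline might look like the following; the record shape is an assumption made for illustration.

```typescript
const NOTIFICATION_WINDOW_HOURS = 72; // Article 33 GDPR: notify the authority within 72 hours of awareness

interface BreachRecord {
  detectedAt: Date;           // when the organisation became aware of the breach
  authorityNotifiedAt?: Date; // set once the supervisory authority has been notified
}

// Hours left before the Article 33 deadline; a negative value means the notification is overdue.
function hoursUntilNotificationDeadline(breach: BreachRecord, now: Date = new Date()): number {
  const deadlineMs = breach.detectedAt.getTime() + NOTIFICATION_WINDOW_HOURS * 60 * 60 * 1000;
  return (deadlineMs - now.getTime()) / (60 * 60 * 1000);
}

function isNotificationOverdue(breach: BreachRecord, now: Date = new Date()): boolean {
  return breach.authorityNotifiedAt === undefined && hoursUntilNotificationDeadline(breach, now) < 0;
}
```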

WhatsApp Ireland Ltd. (EUR 5.5 million)

In the shadow of larger GDPR fines in 2023, WhatsApp Ireland Ltd. received a EUR 5.5 million penalty from the Irish Data Protection Commission (DPC) in January 2023 for making acceptance of its updated terms of service, which expanded data sharing with parent company Meta, a condition of continued use of the app. While seemingly low compared to other fines, this case resonates with its implications for user consent and transparency in the realm of messaging apps.

Unpacking the Case

WhatsApp, known for its focus on privacy and end-to-end encryption, faced scrutiny over its 2021 update to terms of service. The update broadened data sharing with Meta, raising concerns about users' control over their information. The DPC found that:

  • Lack of genuine choice: WhatsApp presented the updated terms as a non-negotiable condition for continued use, effectively forcing users to consent to the expanded data sharing practices. This violated the GDPR's requirement for freely given, informed consent.
  • Opaque information: The terms of service and privacy policy were deemed too complex and lengthy, failing to provide users with clear and easily understandable information about the implications of accepting the new data sharing provisions.

Impact and Implications

While the EUR 5.5 million fine appears modest, it sends a crucial message about user autonomy and transparency in the face of evolving data practices:

  • Respecting user choice: Companies must present data usage options as genuine choices, allowing users to opt out without jeopardizing their access to services.
  • Prioritizing clarity: Privacy policies and data practices should be presented in a concise, accessible manner, empowering users to make informed decisions about their data.
  • Building trust through transparency: Fostering trust requires open and honest communication about data collection, usage, and sharing practices.

Clearview AI Inc. (EUR 5.2 million)

While relatively smaller compared to some GDPR giants, the EUR 5.2 million penalty that the French data protection authority CNIL imposed on Clearview AI Inc. in May 2023, for failing to comply with an earlier order to stop unlawfully processing French residents' data, resonated deeply within the realm of privacy and facial recognition technology. This case exposed fundamental concerns about data scraping, biometric information, and the delicate balance between innovation and ethical data practices.

Unmasking the Clearview Controversy

Clearview AI developed a facial recognition tool built on a massive database of scraped images, primarily sourced from social media platforms without consent. The technology attracted interest from law enforcement agencies and intense scrutiny from privacy advocates and regulators, raising serious concerns.

  • Unlawful data scraping: CNIL found that Clearview violated GDPR by scraping and storing personal data, including biometric information (facial images), without valid consent or legal basis. This practice essentially harvested individuals' identities without their knowledge or control.
  • Lack of transparency and oversight: Clearview operated with limited transparency about its data collection practices and lacked effective mechanisms for individuals to exercise their data subject rights under the GDPR, such as accessing, rectifying, or erasing their facial images.
  • Potential for misuse: The widespread use of such powerful facial recognition technology by law enforcement and other third parties raised concerns about the potential for mass surveillance, discrimination, and privacy violations.

Impact and Implications

The EUR 5.2 million fine, while not the largest of 2023, served as a significant blow to Clearview's credibility and a reminder of the ethical considerations surrounding facial recognition technology.

  • Respecting data privacy: Companies must operate within the legal framework of data protection regulations and obtain valid consent before collecting and processing biometric information like facial images.
  • Prioritizing transparency and accountability: Clear and accessible data practices, coupled with robust mechanisms for user control and oversight, are crucial for building trust and mitigating privacy concerns.
  • Ethical considerations in technology development: Innovation in facial recognition must be coupled with ethical considerations and safeguards to ensure responsible use and prevent potential misuse for surveillance or discrimination.

Spotify (EUR 4.9 million)

In June 2023, the Swedish Data Protection Authority (IMY) levied a fine of approximately EUR 4.9 million (SEK 58 million) on Spotify, the popular music streaming platform. While not the largest GDPR penalty of the year, this case struck a chord with users and privacy advocates, raising concerns about data access, transparency, and user control in the digital music world.

What went wrong?

The Swedish authority's investigation focused on Spotify's handling of user data, specifically:

  • Delayed data access requests: Users reported significant delays in accessing their personal data held by Spotify, exceeding the GDPR's one-month timeframe. This hindered their ability to exercise their data subject rights, such as reviewing, correcting, or deleting their information.
  • Lack of transparency: The authority found Spotify's privacy policy to be overly complex and difficult for users to understand. This lack of transparency made it challenging for users to comprehend how their data was being collected, used, and shared.
  • Inadequate data minimization: Spotify was criticized for collecting and storing more personal data than necessary for providing its streaming services. This raised concerns about the potential for misuse and data breaches.

Impact and Implications

The EUR 4.9 million fine, while not a record-breaker, served as a significant reminder for Spotify and other online platforms of the importance of:

  • Prompt data subject rights fulfillment: Companies must ensure timely and efficient responses to user requests for accessing, rectifying, or erasing their personal data (a minimal deadline-tracking sketch follows this list).
  • Prioritizing transparency and user control: Clear and concise privacy policies, coupled with user-friendly tools for managing data preferences, are essential for building trust and empowering users.
  • Data minimization and purpose limitation: Companies should only collect and store the minimum amount of personal data necessary for their intended purpose, minimizing the risk of privacy violations and data breaches.
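
Article 12(3) GDPR requires a response to data subject requests within one month of receipt, extendable by up to two further months for complex or numerous requests. The TypeScript sketch below tracks that deadline; the request shape is illustrative, not Spotify's or any vendor's actual schema.

```typescript
interface AccessRequest {
  requestId: string;
  receivedAt: Date;
  completedAt?: Date;
  extensionMonths?: 0 | 1 | 2; // Article 12(3) allows up to two further months for complex requests
}

// The response is due one month after receipt, plus any extension the user was informed about.
function responseDueDate(request: AccessRequest): Date {
  const due = new Date(request.receivedAt);
  due.setMonth(due.getMonth() + 1 + (request.extensionMonths ?? 0));
  return due;
}

function isOverdue(request: AccessRequest, now: Date = new Date()): boolean {
  return request.completedAt === undefined && now > responseDueDate(request);
}
```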

Trygg-Hansa (EUR 3 million)

In a case that resonated through Sweden's insurance industry, Trygg-Hansa, a major insurance company, received a EUR 3 million fine from the Swedish Data Protection Authority (IMY) in August 2023. This penalty, levied for security flaws exposing customers' sensitive information, highlighted the crucial role of data security in building trust and ensuring compliance with GDPR regulations.

Unveiling the Security Lapses

The IMY's investigation focused on Trygg-Hansa's merger with Moderna Försäkringar in April 2022. They discovered significant security vulnerabilities in Moderna Försäkringar's IT systems dating back to November 2020. These vulnerabilities allowed unauthorized access to the personal data of approximately 650,000 customers, including:

  • Names and addresses
  • Phone numbers
  • Email addresses
  • Insurance policy details
  • Financial information

These lapses constituted a serious breach of trust and potentially exposed customers to financial fraud, identity theft, and other privacy violations.

GDPR Violations and the EUR 3 Million Fine

The IMY found Trygg-Hansa negligent in addressing the security flaws, violating several key GDPR principles:

  • Inadequate technical and organizational measures: Trygg-Hansa failed to implement appropriate technical and organizational safeguards to protect personal data from unauthorized access.
  • Failure to prevent unauthorized processing: The security vulnerabilities enabled unauthorized access and potential misuse of customer data.
  • Lack of awareness and reporting: The company failed to properly detect and report the security breach promptly, further exposing customer data to potential harm.

Impact and Implications

The EUR 3 million fine, while not the largest of 2023, served as a wake-up call for Trygg-Hansa and other companies handling sensitive personal data. This case underlined the crucial need for:

  • Prioritizing data security: Companies must invest in robust data security measures, including regular security audits, data encryption, and employee training on data protection practices (a minimal encryption sketch follows this list).
  • Proactive vulnerability management: Regularly identifying and addressing security vulnerabilities in IT systems is essential to preventing unauthorized access and data breaches.
  • Transparency and timely reporting: Companies must be transparent with customers about data breaches and report them promptly to data protection authorities, enabling timely mitigation efforts.
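
Encryption of personal data at rest is one small piece of the "robust data security measures" mentioned above. The minimal Node.js/TypeScript sketch below uses AES-256-GCM from the built-in crypto module; key management (storage, rotation, access control) is deliberately out of scope and would be handled by a KMS in practice.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "crypto";

// Minimal AES-256-GCM helpers for encrypting a single field at rest.
// Key management (KMS/HSM storage, rotation, access control) is out of scope here.
function encryptField(plaintext: string, key: Buffer): { iv: Buffer; ciphertext: Buffer; tag: Buffer } {
  const iv = randomBytes(12); // fresh 96-bit nonce for every encryption
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, ciphertext, tag: cipher.getAuthTag() };
}

function decryptField(data: { iv: Buffer; ciphertext: Buffer; tag: Buffer }, key: Buffer): string {
  const decipher = createDecipheriv("aes-256-gcm", key, data.iv);
  decipher.setAuthTag(data.tag); // authentication fails loudly if the ciphertext was tampered with
  return Buffer.concat([decipher.update(data.ciphertext), decipher.final()]).toString("utf8");
}

// Usage sketch: const key = randomBytes(32); const sealed = encryptField("customer@example.com", key);
```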

How to avoid GDPR fines and penalties (and build trust with your users)

Navigating the GDPR can be intimidating, but the rewards of getting compliance right far outweigh the effort. By prioritizing user privacy and data security, you not only avoid hefty fines but also build trust with your customers, boosting both your reputation and bottom line.

Secure Privacy can help you achieve GDPR compliance with:

  • GDPR Audit & Gap Analysis: Identify and address vulnerabilities in your current data practices.
  • Privacy Policy Builder & Generator: Create clear, legally sound privacy policies tailored to your unique needs.
  • Consent Management Platform: Simplify consent collection and management, ensuring valid, documented user agreements.
  • Data Subject Rights Management: Streamline and automate responses to user requests for data access, rectification, or erasure.
  • Data Security Solutions: Implement robust security measures to protect data from unauthorized access, loss, and breaches.
  • Employee Training & Awareness Programs: Train your staff on data protection best practices and GDPR requirements.
  • Ongoing Support & Consulting: Get expert guidance and stay updated on the evolving GDPR landscape.

Don't wait for a fine to take action. Take your first step towards GDPR compliance and building lasting trust with your users today. Contact Secure Privacy for a free consultation and discover how we can help you navigate the GDPR complexities with confidence.

Remember: Compliance is an ongoing journey, not a one-time destination. Partner with Secure Privacy and be confident in your commitment to data protection and user privacy.