April 9, 2026

Data Residency Requirements: EU vs US Explained

Your SaaS platform serves users in Germany, France, and California. Your infrastructure runs on AWS us-east-1. Your analytics vendor is headquartered in San Francisco. Your customer support tool uses a helpdesk provider with data centers in Virginia. Each of these arrangements involves the transfer or storage of personal data in ways that intersect with two fundamentally different regulatory philosophies — and the cost of misunderstanding those differences is climbing. Meta's €1.2 billion fine for unlawful EU-US data transfers remains the largest single GDPR penalty on record. TikTok absorbed €530 million in 2025 for failing to protect EEA user data from unauthorized access in China. Cumulative GDPR fines have now passed €7.1 billion.

Data residency — deciding where data lives, how it moves, and what legal framework governs it at every stage — is no longer an infrastructure footnote. It is a design constraint that shapes cloud architecture, vendor contracts, product roadmaps, and enforcement exposure simultaneously.

TL;DR

  • The EU does not require personal data to be stored within EU borders, but it strictly governs how data can leave the EEA. Every transfer to a non-adequate third country requires a legal mechanism — adequacy decision, Standard Contractual Clauses, or Binding Corporate Rules — backed by a Transfer Impact Assessment where adequacy is absent.
  • The US has no federal data residency law. The patchwork of state privacy laws focuses on consumer rights and data practices rather than where data is physically stored.
  • The EU-US Data Privacy Framework survived its first legal challenge in September 2025, but remains under appeal and subject to political risks. Organizations relying solely on the DPF should maintain SCC backup mechanisms.


What Data Residency Actually Means

Data residency refers to the physical or geographic location where data is stored and processed. Data localization is a related but stricter concept: a legal requirement that data must remain within a specific territory and cannot be transferred outside it. China's Personal Information Protection Law mandates localization for certain data categories. Russia imposes localization requirements for personal data of Russian citizens. These are genuine localization mandates.

GDPR is frequently described as a data residency law, but this is technically imprecise. GDPR does not prohibit storing EU personal data outside the EU. What it prohibits is transferring personal data to a third country — any country outside the European Economic Area — unless that transfer is grounded in one of the legal mechanisms Chapter V of the Regulation provides. The distinction matters practically: a European company can legitimately store data on servers in the United States, provided that transfer is lawful. The legal mechanism, not the server location, is the compliance question.

This distinction shapes how engineers, product managers, and legal teams should think about EU compliance. The question is not "do we have an EU data center?" — though some use cases make EU storage the most practical path to compliance. The question is "what is the legal basis for every transfer of EU personal data outside the EEA, and can we demonstrate it?"

The EU Framework: Transfer Rules, Not Localization Rules

GDPR Chapter V (Articles 44 to 50) governs international data transfers. The core principle is that the level of protection GDPR affords EU residents must travel with the data — personal data does not lose its GDPR protections simply by moving to a server outside the EEA. Every transfer requires either an adequacy decision confirming that the receiving country's legal framework provides essentially equivalent protection, or an appropriate safeguard that the controller or processor puts in place independently.

Adequacy decisions are the simplest path. The European Commission has issued adequacy decisions for thirteen jurisdictions, including the UK (renewed until December 2031 with ongoing monitoring conditions), Canada (commercial organizations), Japan, Switzerland, and the United States under the EU-US Data Privacy Framework. When sending data to an adequacy-covered jurisdiction, no additional transfer mechanism is required beyond the standard GDPR obligations.

The EU-US Data Privacy Framework, which replaced the invalidated Privacy Shield in July 2023, survived its first legal challenge when the EU General Court dismissed the Latombe case on September 3, 2025. The court confirmed that the US provided an adequate level of protection as of the time of the Commission's decision. However, the ruling was explicitly limited to the facts as they stood in July 2023, and an appeal is now pending before the CJEU. Separately, political developments in the US — including changes to the Privacy and Civil Liberties Oversight Board, which was a key DPF oversight mechanism — have prompted several EU data protection authorities to urge businesses to prepare contingency plans. Norwegian, Danish, and Swedish regulators have explicitly recommended developing "exit strategies" for US-dependent transfers that could function immediately if the DPF were invalidated. Understanding the full picture of cross-border data transfer mechanisms and how they interact with consent obligations is the starting point for any organization building on international infrastructure.

Where no adequacy decision exists, organizations must use an approved safeguard. Standard Contractual Clauses (SCCs) are the most widely used mechanism. These are model contracts adopted by the European Commission that contractually bind the data importer to protect personal data to GDPR standards. Updated SCCs from Q2 2025 simplified certain insertion procedures while enhancing clarity on transfer scenarios. When SCCs are used for transfers to countries without adequacy, a Transfer Impact Assessment (TIA) is also required — an analysis of whether the laws of the receiving country would actually allow the SCCs to be honored, or whether government access powers or surveillance laws would undermine the protection the SCCs purport to provide.
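The mechanism-selection logic described above can be sketched as a small lookup. This is illustrative only: the country sets below are deliberately partial assumptions, not the Commission's actual adequacy register, and real decisions need legal review.

```python
# Sketch: selecting a GDPR Chapter V transfer mechanism per destination.
# Both country sets are illustrative and incomplete -- check the European
# Commission's current adequacy decisions before relying on anything here.

EEA = {"DE", "FR", "IE", "NL", "NO", "IS", "LI"}   # partial, for illustration
ADEQUACY = {"GB", "CA", "JP", "CH"}                # partial; US handled separately

def required_mechanism(dest_country: str, dpf_certified: bool = False) -> str:
    """Return the transfer mechanism a flow to `dest_country` needs."""
    if dest_country in EEA:
        return "none: intra-EEA, no Chapter V transfer"
    if dest_country == "US":
        # DPF adequacy covers only self-certified importers; otherwise SCCs.
        return "adequacy (DPF)" if dpf_certified else "SCCs + TIA"
    if dest_country in ADEQUACY:
        return "adequacy decision"
    return "SCCs + TIA (or BCRs for intra-group transfers)"
```

The point the sketch encodes is that the importer's status matters, not just the country: the same US vendor needs SCCs plus a TIA the moment its DPF certification lapses.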

This is where the TikTok case is instructive. The Irish DPC fined TikTok €530 million specifically because TikTok had conducted multiple TIAs regarding transfers to China but had failed to adequately assess whether Chinese law and practices would protect the transferred data. Having a TIA on file was not sufficient; the TIA had to actually reach the right conclusion and feed into the selection of appropriate supplementary measures. Conducting a Transfer Impact Assessment is not a formality — it requires a genuine assessment of the receiving country's legal environment and a credible conclusion about whether protection is essentially equivalent to EU standards.

Binding Corporate Rules are the third primary mechanism, available to multinational organizations that want to govern intra-group transfers through a single approved framework rather than individual SCCs between every group entity. BCRs require approval from a lead data protection authority and are more resource-intensive to establish, but provide operational simplicity for large enterprise groups with complex internal data flows.

The US Landscape: No Federal Residency Law, No Federal Privacy Law

The contrast with the EU framework is stark. The United States has no federal data residency law. There is no equivalent to GDPR's Chapter V governing where data can be sent or stored. A California company that transfers user data to servers in Singapore, Australia, or anywhere else faces no federal-level restriction based on the data's geographic movement.

The US approach is sectoral rather than comprehensive. Certain categories of sensitive data attract specific residency or access restrictions. Healthcare data governed by HIPAA does not carry explicit residency requirements, but Business Associate Agreements must ensure that covered entities and their partners protect PHI to HIPAA standards wherever it is stored. Financial data governed by the Gramm-Leach-Bliley Act similarly imposes security obligations without geographic constraints. The US Department of Justice's Data Security Program, which came into full effect in October 2025, imposes restrictions on the transfer of "bulk sensitive personal data" and US government-related data to entities associated with countries of concern — China, Russia, Iran, North Korea, Cuba, and Venezuela — with penalties reaching $368,136 per violation. This is a national security restriction, not a consumer privacy framework, but it creates direct data localization implications for organizations handling data in affected categories.

The US state privacy law landscape — now spanning more than 20 states with comprehensive consumer privacy laws — is overwhelmingly focused on consumer rights and data practices rather than where data is physically stored. California's CCPA/CPRA, Virginia's CDPA, Colorado's CPA, and their counterparts require transparency in data collection, consumer access and deletion rights, opt-out mechanisms for data sales, and data processing assessments. None of these laws mandate that consumer data remain within state borders or restrict transfers to other jurisdictions. The enforcement priorities that US state regulators are acting on in 2026 — opt-out mechanism functionality, consent banner compliance, vendor contract adequacy — have nothing to do with geographic data residency.

The practical consequence of this divergence is significant. A US-based SaaS company building primarily for US customers faces no cross-border transfer obligations in any direction from a US regulatory standpoint. That same company, the moment it acquires customers in Germany or France, inherits GDPR's transfer framework for all the data those EU residents generate — regardless of where the company's servers are located or what US law says about the matter.

Technical Implications: How Transfer Rules Shape Infrastructure Decisions

Understanding the legal framework is the prerequisite; designing systems around it is the operational challenge. For engineering teams, data residency compliance manifests in several specific architectural patterns.

The most direct response to GDPR's transfer framework is EU-region data storage. Major cloud providers — AWS, Google Cloud, Azure — offer EU-region deployments that keep data physically within the EEA, eliminating the transfer question entirely for data that never leaves EU infrastructure. This is not required by GDPR, but it resolves transfer compliance elegantly and is increasingly favored by enterprise European customers who want contractual certainty that their data remains in EU jurisdiction. The tradeoff is operational: EU-region infrastructure is typically more expensive than US-region alternatives, latency characteristics differ, and some cloud services are only available in certain regions.
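One way to hold the line on EU-region deployment is a CI-style check that fails when any resource drifts out of the allowed regions. The region set and manifest shape below are assumptions for the sketch, not a complete catalogue of any provider's EU regions.

```python
# Sketch: a CI check that every resource in a deployment manifest is
# pinned to an EEA region. The allow-list uses AWS region names as
# examples; extend it for your actual provider and footprint.

EU_REGIONS = {"eu-central-1", "eu-west-1", "eu-west-3", "eu-north-1"}

def non_eu_resources(manifest: dict) -> list[str]:
    """Return names of resources deployed outside the allowed EU regions."""
    return [name for name, region in manifest.items() if region not in EU_REGIONS]

deployment = {
    "user-db": "eu-central-1",
    "event-queue": "eu-west-1",
    "session-cache": "us-east-1",   # drift: would move EU data out of the EEA
}

assert non_eu_resources(deployment) == ["session-cache"]
```

Running a check like this on every infrastructure change turns "our EU data stays in the EEA" from a policy statement into an enforced invariant.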

For organizations that cannot or choose not to route EU data exclusively through EU infrastructure, tenant-level data segregation is the alternative pattern. EU users' personal data is stored and processed in a logically and technically isolated environment — separate databases, separate processing pipelines — from non-EU users' data. Transfer controls can then be applied specifically to the EU data flows while US users' data operates under US-region infrastructure without the same constraints. This pattern is common in SaaS platforms that serve both US enterprise and EU enterprise customers and want to optimize infrastructure costs while maintaining demonstrable EU compliance.
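A minimal sketch of this routing pattern, assuming hypothetical tenant metadata and connection strings, might look like this:

```python
# Sketch: routing each request to a region-specific database based on the
# tenant's residency, recorded at onboarding. All names are hypothetical.

TENANT_REGION = {"acme-gmbh": "eu", "acme-inc": "us"}

DB_BY_REGION = {
    "eu": "postgres://db.eu-central-1.internal/app",   # EEA infrastructure
    "us": "postgres://db.us-east-1.internal/app",
}

def db_url_for(tenant_id: str) -> str:
    """Resolve the regional database for a tenant; default to the EU
    (the stricter regime) when residency is unknown."""
    region = TENANT_REGION.get(tenant_id, "eu")
    return DB_BY_REGION[region]
```

Defaulting unknown tenants to the EU side is a deliberate design choice: misrouting a US tenant into EU infrastructure costs money, while misrouting an EU tenant into US infrastructure creates an undocumented transfer.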

Analytics and AI training are two specific scenarios where the transfer issue is most frequently overlooked. When EU user behavioral data — session events, clickstreams, interaction logs — is routed to a US-based analytics vendor's data collection endpoint, that routing is a transfer. Many analytics integrations initialize before consent is obtained (the SDK pre-consent problem discussed separately), creating a compounding violation: both an unauthorized transfer and a lack of lawful basis for the underlying processing. Analytics vendors that offer EU-region data storage and EU-region processing eliminate the transfer dimension, though organizations still need valid consent and disclosure.

AI training presents a specific complexity: if personal data of EU residents is used to train a model on infrastructure located outside the EEA, the training run itself is a transfer. The legal mechanism — typically SCCs with the infrastructure provider — must cover that specific processing activity. The EDPB has noted that organizations deploying third-party LLMs must verify the lawful acquisition of training data, and transfer obligations attach to the data flows that produced the training dataset as well as any inference-time processing of EU users' data.

Practical Compliance Scenarios

For a SaaS company with global users, the first task is a data flow map that distinguishes EU users' data from non-EU users' data at every point in the processing chain — collection, storage, analytics, support tools, third-party integrations, backups, and AI processing. Each flow involving EU personal data leaving the EEA needs a documented legal mechanism. The DPF covers transfers to self-certified US organizations for activities within the DPF's scope, but organizations should maintain executed SCCs as backup and should assess any supplementary measures appropriate for the data categories involved.
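A data flow map of this kind can be kept machine-checkable, so that any EEA-exit flow without a documented mechanism fails a compliance check rather than surfacing in an audit. The flow records below are hypothetical:

```python
# Sketch: a minimal transfer register. Every flow of EU personal data
# leaving the EEA must carry a documented legal mechanism; flows that
# don't are surfaced as gaps. Records are invented for illustration.

flows = [
    {"name": "app-db",      "exits_eea": False, "mechanism": None},
    {"name": "helpdesk",    "exits_eea": True,  "mechanism": "DPF"},
    {"name": "ml-training", "exits_eea": True,  "mechanism": None},  # gap
]

def mechanism_gaps(register: list[dict]) -> list[str]:
    """Names of EEA-exit flows with no documented transfer mechanism."""
    return [f["name"] for f in register if f["exits_eea"] and not f["mechanism"]]

assert mechanism_gaps(flows) == ["ml-training"]
```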

For an analytics platform transferring EU event data to US servers, the practical options are: use an EU-region data endpoint and keep processing within the EEA; execute SCCs with the analytics vendor and complete a TIA confirming that US law does not prevent the vendor from honoring those clauses; or route EU analytics through a privacy-preserving aggregation layer that produces non-personal aggregate data before it leaves the EEA.
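The third option, aggregating inside the EEA so only non-personal counts leave, might look like this minimal sketch. The suppression threshold of 10 is an illustrative choice, not a legal standard, and whether aggregates count as non-personal depends on the specifics of the data:

```python
# Sketch: an EEA-side aggregation layer. Per-user events stay in the EEA;
# only event-type counts above a suppression threshold are exported.

from collections import Counter

def aggregate(events: list[dict], k: int = 10) -> dict[str, int]:
    """Count events per type, suppressing buckets smaller than k so that
    rare event types cannot single out individuals."""
    counts = Counter(e["type"] for e in events)
    return {t: n for t, n in counts.items() if n >= k}

events = [{"type": "page_view"}] * 25 + [{"type": "checkout"}] * 3
assert aggregate(events) == {"page_view": 25}   # 'checkout' bucket suppressed
```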

For a customer support platform that processes EU users' conversation data on US infrastructure, the support vendor needs to be covered by a valid DPA and transfer mechanism. The EDPB has specifically called out that transfer disclosures in privacy notices must explicitly identify third-country recipients — generic language like "we may share with service providers" is no longer sufficient. EDPB transparency enforcement in 2026 specifically targets inadequate third-party and transfer disclosure.

Best Practices for Building a Defensible Position

Map every data flow that involves EU personal data leaving the EEA. This is the foundational step without which no transfer compliance position is credible. The map must be current — new vendor integrations, analytics tools, and infrastructure changes can introduce new transfer flows without engineering teams recognizing the legal implications.

Document the legal mechanism for each transfer. Where the DPF covers it, record the vendor's DPF certification. Where SCCs apply, ensure they are executed and reflect the 2021 European Commission modules (as amended in Q2 2025). Where a TIA is required, complete it genuinely — not as a formality. The TikTok enforcement demonstrated that a TIA which fails to reach the right conclusion provides no protection; it merely documents the compliance failure.

Keep SCC backup mechanisms ready for any transfers that currently rely on the DPF. The DPF has survived one legal challenge, but the appeal is pending, political risks to the oversight mechanisms are real, and the history of EU-US transfer frameworks — Safe Harbor, Privacy Shield, DPF — suggests that organizations should not build operational dependencies that cannot be quickly switched to SCC-based compliance. Executing SCCs in parallel with DPF reliance is a low-cost insurance against the scenario that has already occurred twice.

Align vendor contracts with GDPR's Article 28 requirements, which specify what Data Processing Agreements must contain. Vendor DPAs need to address not just the direct processing relationship but also sub-processor chains — the third parties your vendors themselves use to deliver their services. Sub-processor lists should be kept current, and significant new sub-processors should trigger a review right in the DPA.
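Keeping sub-processor lists current can be partly automated: diff each published list against the last reviewed one and flag additions for the DPA review right. The vendor names below are invented:

```python
# Sketch: flag new sub-processors between two published lists so the
# review right in the DPA can be exercised. Names are hypothetical.

def new_subprocessors(previous: set[str], current: set[str]) -> set[str]:
    """Sub-processors present in the current list but not yet reviewed."""
    return current - previous

reviewed = {"CloudHost EU", "MailRelay"}
published = {"CloudHost EU", "MailRelay", "LLM-Inference-Inc"}

assert new_subprocessors(reviewed, published) == {"LLM-Inference-Inc"}
```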

FAQ

Does GDPR require data to stay in the EU?

No. GDPR does not mandate data localization within the EU. It requires that personal data transferred outside the EEA be protected by a legal mechanism — adequacy decision, SCCs, or BCRs — that ensures essentially equivalent protection.

What are data residency requirements in the US?

There are no federal data residency requirements in the US for general personal data. Sectoral rules govern specific data categories (healthcare under HIPAA, certain financial data), and the DOJ's Data Security Program restricts bulk sensitive data transfers to entities in designated countries of concern. State privacy laws focus on consumer rights, not geographic data storage.

Can EU data be stored in the US?

Yes, provided the transfer is covered by a valid legal mechanism. US organizations certified under the EU-US Data Privacy Framework can receive EU personal data without additional safeguards. Transfers to non-DPF US organizations require SCCs plus a TIA, along with any appropriate supplementary measures.

What is the EU-US Data Privacy Framework?

The DPF is an adequacy decision adopted by the European Commission in July 2023, enabling the free flow of personal data from the EU to certified US organizations. It replaced the Privacy Shield, which was invalidated by the Schrems II decision in 2020. The DPF survived its first legal challenge in September 2025 but remains under appeal.

How do companies handle cross-border data transfers?

By mapping all data flows involving EU personal data leaving the EEA, identifying and executing the appropriate transfer mechanism for each flow, completing Transfer Impact Assessments where required, maintaining current Data Processing Agreements with all processors and sub-processors, and keeping backup mechanisms operational in case primary transfer tools are disrupted.

Data residency compliance is ultimately an engineering and governance problem dressed in legal language. The legal framework tells you which data flows require justification. The infrastructure and vendor decisions determine whether those flows actually exist and what mechanisms can cover them. Getting the two layers aligned — legal analysis informing architecture, architecture reviewed against legal requirements — is what makes a compliance position defensible when a regulator investigates.

See how Secure Privacy's data mapping, consent management, and privacy governance platform gives you the visibility and documentation infrastructure to manage cross-border transfer obligations at scale.
