{
    "componentChunkName": "component---src-templates-post-js",
    "path": "/blog/california-ai-regulations-2026",
    "result": {"data":{"allPrismicBlogpostpage":{"edges":[{"node":{"uid":"california-ai-regulations-2026","type":"blogpostpage","lang":"en-gb","id":"cf811e97-f738-5b17-a4cc-5795d204c447","alternate_languages":[],"data":{"activate_public_scanner_cta_header":false,"metadescription":{"text":"Learn how California AI regulations in 2026 impact your business. CPRA, automated decision-making rules, and compliance steps explained for legal, product, and engineering teams."},"metatitle":{"text":"California AI Regulations 2026: CPRA, ADMT & AI Compliance Guide"},"categories":[{"is_pilar_page_":true,"table_of_content_title":{"richText":[]}}],"backgroundpreview":{"alt":"secure privacy logo","url":"https://secure-privacy.cdn.prismic.io/secure-privacy/6b014258-aa3b-49d3-9bf0-fc6cfafbd2b7_logo-technology.svg?ixlib=gatsbyFP&auto=compress%2Cformat&fit=max&q=45"},"title":{"text":"California AI Regulations 2026: A Practical Compliance Guide"},"preview":{"alt":null,"url":"https://images.prismic.io/secure-privacy/adOzdZGXnQHGZSIq_cali.png?ixlib=gatsbyFP&auto=format%2Ccompress&fit=max&q=45"},"date":"2025-04-08","canonical":{"text":"https://secureprivacy.ai/blog/california-ai-regulations-2026"},"body":[{"id":"eec6c232-9aeb-5470-bb82-c6a41ea0fd01","slice_type":"text","primary":{"text":{"richText":[{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"paragraph","text":"Yet.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"As of January 1, 2026, that model is subject to the most comprehensive AI governance framework any US state has enacted. 
If your business meets CCPA thresholds and that model makes decisions without meaningful human involvement — you're already operating under California's Automated Decision-Making Technology regulations.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"The compliance clock didn't start when you noticed.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"Fines run up to $2,500 per unintentional violation and $7,500 per intentional violation — applied per consumer. A hiring system that has screened thousands of applicants compounds that exposure rapidly. The California Privacy Protection Agency currently has hundreds of active investigations open.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"The question isn't whether California regulates AI. It's which of your systems triggered which obligations, and when.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"heading4","text":"TL;DR","spans":[{"start":0,"end":5,"type":"strong"}],"direction":"ltr"},{"type":"list-item","text":"California's CCPA ADMT regulations — finalized September 2025, effective January 1, 2026 — impose pre-use notices, opt-out rights, access rights, and risk assessments on businesses using automated decision-making for significant decisions affecting California consumers.","spans":[],"direction":"ltr"},{"type":"list-item","text":"\"Significant decisions\" cover employment, financial services, housing, healthcare, and education. Advertising was explicitly excluded from the final text.","spans":[],"direction":"ltr"},{"type":"list-item","text":"Businesses already using ADMT have until January 1, 2027 to implement notice and opt-out requirements. Risk assessments should have started January 1, 2026. 
CPPA attestations are due April 1, 2028.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"}]}}},{"id":"d5c87b21-0eb0-5128-94ea-217d24c9f0b0","primary":{"cta_options":"CTA Header","blog_page_cta_button_link":{"url":"https://deft-thinker-159.kit.com/privacy-by-design-checklist"},"blog_page_cta_button_text":{"richText":[{"type":"paragraph","text":"DOWNLOAD YOUR PRIVACY BY DESIGN CHECKLIST","spans":[],"direction":"ltr"}]},"cta_header_title":{"richText":[]},"cta_header_description":{"richText":[{"type":"paragraph","text":"Prioritizing user privacy is essential. Secure Privacy's free Privacy by Design Checklist helps you integrate privacy considerations into your development and data management processes.","spans":[],"direction":"ltr"}]},"logo":{"url":"https://images.prismic.io/secure-privacy/ZiJ6NfPdc1huKpCp_Group481491.png?ixlib=gatsbyFP&auto=format%2Ccompress&fit=max&q=45","alt":null}},"slice_type":"blog_details_page_cta_button"},{"id":"e73bc3bd-1f80-546d-9e54-c85548c86d96","slice_type":"text","primary":{"text":{"richText":[{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"heading2","text":"The California AI Regulatory Landscape in 2026","spans":[{"start":0,"end":46,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"California didn't pass one AI law. It built a layered system — AI-specific obligations stacked on top of CCPA/CPRA, targeting different use cases at different entities.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"The operationally critical layer for most businesses: the CPPA's ADMT regulations. Adopted unanimously July 24, 2025. Approved by the Office of Administrative Law September 22, 2025. Effective January 1, 2026.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"These sit inside the CCPA framework. 
Same applicability thresholds:","spans":[],"direction":"ltr"},{"type":"list-item","text":"Annual gross revenue exceeding $26.6 million","spans":[],"direction":"ltr"},{"type":"list-item","text":"Processing personal information of 100,000+ California residents or households annually","spans":[],"direction":"ltr"},{"type":"list-item","text":"Deriving 50%+ of revenue from selling or sharing personal information","spans":[],"direction":"ltr"},{"type":"paragraph","text":"On top of that: the Transparency in Frontier Artificial Intelligence Act (SB 53, effective January 1, 2026). This targets developers of \"frontier models\" — systems trained using more than 10^26 floating-point operations. Penalties reach $1 million per violation. If you're not building frontier models, this doesn't apply to you directly — but it matters down your supply chain: the frontier developers whose models or APIs you rely on must publish documented safety frameworks.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"Also in effect: the Generative AI Training Data Transparency Act (AB 2013, effective January 1, 2026), requiring developers of publicly available generative AI systems to publish training dataset disclosures, including whether those datasets contain personal information. The AI Transparency Act (SB 942) — requiring AI platforms with over one million monthly users to offer AI-content detection tools — takes effect August 2, 2026.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"One deliberate choice in the final July 2025 regulations: the CPPA removed every reference to \"artificial intelligence\" from the ADMT text. Replaced it with a functional definition — technologies that use computation to replace or substantially replace human decision-making. 
The compliance question isn't \"does this use AI?\" It's \"does this technology replace human judgment in covered contexts?\"","spans":[],"direction":"ltr"},{"type":"paragraph","text":"Understanding how California's evolving CCPA requirements interact with AI processing — including what the 2026 amendments change for businesses using automated processing at scale — is the foundation for any California AI compliance assessment.","spans":[{"start":0,"end":180,"type":"hyperlink","data":{"link_type":"Web","url":"https://secureprivacy.ai/blog/ccpa-requirements-2026-complete-compliance-guide","target":"_blank"}}],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"heading2","text":"What Is ADMT and What Does \"Significant Decision\" Mean?","spans":[{"start":0,"end":55,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"ADMT: any technology that processes personal information and uses computation to replace human decision-making or substantially replace human decision-making.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"Simple rule-based algorithms. Sophisticated ML models. Both qualify — if they operate on personal information and produce outputs that replace or substantially replace human judgment.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"The scope is bounded by \"significant decisions.\" Consumer rights — pre-use notice, opt-out, access — only activate when ADMT makes a significant decision about a consumer. 
Significant decisions are those that provide, deny, or set terms for:","spans":[],"direction":"ltr"},{"type":"list-item","text":"Financial or lending services","spans":[],"direction":"ltr"},{"type":"list-item","text":"Housing","spans":[],"direction":"ltr"},{"type":"list-item","text":"Insurance","spans":[],"direction":"ltr"},{"type":"list-item","text":"Education","spans":[],"direction":"ltr"},{"type":"list-item","text":"Employment or independent contracting — including hiring, work assignment, compensation, promotion, demotion, suspension, and termination","spans":[],"direction":"ltr"},{"type":"list-item","text":"Healthcare services","spans":[],"direction":"ltr"},{"type":"list-item","text":"Access to essential goods and services","spans":[],"direction":"ltr"},{"type":"paragraph","text":"What's excluded: advertising. A recommendation algorithm surfacing products is not making a significant decision. A credit scoring model approving or denying a loan is. A personalisation engine ranking content is not. An algorithm screening job applications before a human sees them is.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"This boundary matters enormously for adtech and martech companies who feared earlier draft versions would capture behavioural advertising.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"Profiling is explicitly included — technologies that analyse or predict intelligence, ability, aptitude, job performance, reliability, or similar characteristics. Performance management tools, skills assessment platforms, and behavioural analytics systems used in employment contexts fall squarely within the regulations if they substantially replace human judgment.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"The meaningful human involvement exit. 
Businesses can exit ADMT obligations for a specific system — but only if the human decision-maker can interpret the ADMT's outputs, actually analyses those outputs alongside other information, and has genuine authority to make or change the decision independently of the ADMT's recommendation. A hiring manager who invariably accepts the AI-generated ranking without independent analysis is not providing meaningful human involvement, even if override authority technically exists.","spans":[{"start":0,"end":38,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"heading2","text":"Consumer Rights Under the ADMT Regulations","spans":[{"start":0,"end":42,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Three consumer rights attach to ADMT used for significant decisions.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"Right to pre-use notice. Before collecting personal information for ADMT use in a significant decision — or before using already-collected data for such a decision — businesses must provide a notice describing how the ADMT works, what data influences its outputs, and how outputs feed into the decision process. The notice must also describe the alternative process for consumers who opt out. That alternative must exist before you can lawfully operate the ADMT pathway. Pre-use notices can be delivered alongside existing CCPA privacy notices.","spans":[{"start":0,"end":24,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Right to opt out. Consumers must receive at least two opt-out methods — accessible, not buried in consent banners. Narrow exceptions exist for fraud prevention or safety purposes, or where businesses have implemented specific evaluation and safeguard requirements. These exceptions require documented compliance, not assumption. 
The opt-out right applies to ADMT used for significant decisions only; it does not extend to ADMT used for purposes other than significant decisions.","spans":[{"start":0,"end":17,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Right to access. Consumers may request information about the ADMT used against them — including its logic, the parameters that generated outputs, and the specific output as it related to their case. Businesses must respond within CCPA's standard response timelines.","spans":[{"start":0,"end":16,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"}]}}},{"id":"4adc2903-39dc-5eee-90ca-4e266e02c061","primary":{"cta_options":"CTA Header","blog_page_cta_button_link":{"url":"https://deft-thinker-159.kit.com/privacy-by-design-checklist"},"blog_page_cta_button_text":{"richText":[{"type":"paragraph","text":"DOWNLOAD YOUR PRIVACY BY DESIGN CHECKLIST","spans":[],"direction":"ltr"}]},"cta_header_title":{"richText":[]},"cta_header_description":{"richText":[{"type":"paragraph","text":"Prioritizing user privacy is essential. Secure Privacy's free Privacy by Design Checklist helps you integrate privacy considerations into your development and data management processes.","spans":[],"direction":"ltr"}]},"logo":{"url":"https://images.prismic.io/secure-privacy/ZiJ6NfPdc1huKpCp_Group481491.png?ixlib=gatsbyFP&auto=format%2Ccompress&fit=max&q=45","alt":null}},"slice_type":"blog_details_page_cta_button"},{"id":"03eab666-f048-5baa-b61b-a3c4ddfde916","slice_type":"text","primary":{"text":{"richText":[{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"heading2","text":"Risk Assessments: What Must Be Done and When","spans":[{"start":0,"end":44,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Risk assessments are a separate obligation. 
Different timeline.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"Businesses must complete assessments before initiating processing activities that present significant privacy risk — including ADMT for significant decisions, personal information used to train ADMT, and automated processing used to infer sensitive personal information.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"For processing already underway before January 1, 2026: assessments due by December 31, 2027.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"For new processing beginning on or after January 1, 2026: assessments due before processing begins.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"Each assessment must evaluate: the ADMT's purposes and benefits, its logic, foreseeable negative impacts, planned safeguards, and policies to limit those impacts. If privacy risks outweigh benefits, the business may not proceed unless risks can be sufficiently mitigated. Assessments must be certified by a senior executive and retained for five years. Review required every three years minimum, or within 45 calendar days of any material change.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"By April 1, 2028, businesses must submit an attestation to the CPPA confirming assessments were completed, alongside a summary of assessment findings. The CPPA will know who has and hasn't done this. 
Enforcement investigators will cross-reference it.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"Structuring these risk assessments as integrated documentation that satisfies both California requirements and EU AI Act obligations for organisations operating transnationally reduces duplicated effort and produces governance infrastructure that serves multiple regulatory frameworks simultaneously.","spans":[{"start":0,"end":176,"type":"hyperlink","data":{"link_type":"Web","url":"https://secureprivacy.ai/blog/eu-ai-act-implementation-guide","target":"_blank"}}],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"heading2","text":"The Compliance Workflow: Five Practical Steps","spans":[{"start":0,"end":45,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Most organisations' AI inventories don't exist in a form that makes these obligations easy to assess. Models are owned by different teams, documented inconsistently, and described by their technical architecture — not their regulatory function. Compliance begins with clarity.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"Step one: Inventory every AI system and algorithmic tool that processes personal information of California residents. Engineering, product, and legal must work through the actual deployed technology stack together — not documentation that may not reflect reality. Capture: what personal information each system uses as input, what outputs it produces, how those outputs feed into decisions, and what decision domain applies.","spans":[{"start":0,"end":56,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Step two: Map each system against the ADMT definition and significant decision taxonomy. 
Systems processing personal information and producing outputs that replace or substantially replace human judgment in finance, employment, housing, healthcare, or education are in scope. Systems supporting advertising, content recommendation, or internal analytics without making significant decisions about individual consumers are likely out of scope for consumer rights provisions — though they may still trigger risk assessment obligations if they infer sensitive personal information.","spans":[{"start":0,"end":88,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Step three: Assess risk level for each in-scope system. Does it use sensitive personal information — health data, racial or ethnic origin, biometric identifiers, precise geolocation, sexual orientation, immigration status? Does it operate on protected class characteristics in employment? Does it affect large numbers of California consumers? High-risk systems require completed assessments before deployment (new systems) or by December 31, 2027 (existing systems).","spans":[{"start":0,"end":55,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Step four: Implement consumer-facing infrastructure. Pre-use notices that accurately describe the ADMT. Opt-out mechanisms that are accessible and functional — not buried in privacy policy sub-pages. An alternative decision-making pathway for consumers who opt out. An access request handling workflow for ADMT-specific inquiries. Required for all covered ADMT by January 1, 2027 for existing systems; prior to deployment for systems implemented after that date. 
Implementing AI consent and control workflows that actually enforce opt-out choices at the processing layer — not just record them in a database — is the technical implementation challenge most legal teams underestimate.","spans":[{"start":0,"end":52,"type":"strong"},{"start":462,"end":607,"type":"hyperlink","data":{"link_type":"Web","url":"https://secureprivacy.ai/blog/ai-personal-data-protection-gdpr-ccpa-compliance","target":"_blank"}}],"direction":"ltr"},{"type":"paragraph","text":"Step five: Build documentation and audit readiness. Risk assessments must be retained, certified, reviewed, and updated on schedule. ADMT inventories must stay current as deployments evolve. Third-party vendors providing ADMT to your business don't absorb your compliance obligation — you remain responsible and must collaborate with vendors to access the information about ADMT logic that consumer access rights require.","spans":[{"start":0,"end":51,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"heading2","text":"California vs the EU AI Act: Two Different Logics","spans":[{"start":0,"end":49,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Same high-risk domains. Structurally different compliance mechanisms.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"The EU AI Act is product-focused and provider-centric. It classifies AI systems by inherent risk tier and regulates the systems themselves — conformity assessments, technical documentation, EU database registration, human oversight requirements. Primary obligations fall on providers. 
High-risk categories under Annex III substantially overlap California's significant decision domains — employment, credit, healthcare, education — and additionally cover biometric identification, critical infrastructure, and law enforcement.","spans":[{"start":17,"end":54,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"California's ADMT framework is consumer-focused and deployer-centric. It doesn't regulate AI systems as products or require conformity assessments. It grants California consumers specific rights regarding how businesses use ADMT against them — and requires businesses to document that ADMT use is justified. Obligations fall on deployers: businesses using the technology, not developers building it.","spans":[{"start":31,"end":69,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"The practical consequence for businesses subject to both: EU AI Act technical documentation, risk assessment, and human oversight requirements are more demanding for in-scope systems than California's. But California's consumer-facing obligations — pre-use notice, opt-out mechanisms, access rights — require operational implementation the EU conformity framework doesn't specifically mandate in the same form.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"Build a complete EU AI Act compliance programme for a high-risk system and you'll satisfy a significant portion of California's risk assessment requirements as a byproduct. You'll still need to implement California-specific notice and opt-out infrastructure separately.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"Integrate the assessments. 
Don't run parallel programmes.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"heading2","text":"Common Mistakes That Generate Compliance Risk","spans":[{"start":0,"end":45,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Assuming vendor use transfers compliance obligation. It doesn't. California's ADMT regulations are explicit: a business cannot escape liability by contracting with a third-party ADMT provider. The business using that vendor's system to make significant decisions remains responsible for pre-use notices, opt-out mechanisms, access rights, and risk assessments. And must collaborate with the vendor to access ADMT logic information that access rights require.","spans":[{"start":0,"end":52,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Classifying systems by internal name, not regulatory function. Teams describe their tools as \"recommendation engines,\" \"scoring tools,\" \"efficiency platforms.\" The regulations don't care. A performance management tool that assigns numeric scores to employees — and that managers routinely accept without independent analysis — is making a significant employment decision. Internal naming changes nothing.","spans":[{"start":0,"end":62,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Deploying ADMT before building the alternative pathway. The opt-out right is meaningful only if a genuine non-ADMT alternative exists. Businesses with no process for making employment or credit decisions without ADMT cannot legally offer an opt-out they cannot honour. 
The alternative pathway must exist before the ADMT pathway goes live.","spans":[{"start":0,"end":55,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"heading2","text":"FAQ","spans":[{"start":0,"end":3,"type":"strong"}],"direction":"ltr"},{"type":"heading4","text":"Does California regulate AI? ","spans":[{"start":0,"end":28,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Yes. California regulates AI through multiple laws: the CCPA's ADMT regulations (effective January 1, 2026 for risk assessments, January 1, 2027 for consumer rights), the Transparency in Frontier AI Act (effective January 1, 2026 for frontier model developers), and the Generative AI Training Data Transparency Act (effective January 1, 2026 for generative AI developers).","spans":[],"direction":"ltr"},{"type":"heading4","text":"What is ADMT under CPRA? ","spans":[{"start":0,"end":24,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Automated Decision-Making Technology is any technology that processes personal information and uses computation to replace or substantially replace human decision-making. It includes profiling technologies that analyse or predict human characteristics.","spans":[],"direction":"ltr"},{"type":"heading4","text":"Do I need to disclose AI usage to users? ","spans":[{"start":0,"end":40,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Yes, if your ADMT makes significant decisions concerning California consumers. Pre-use notices are required before collecting personal information for ADMT use, or before using already-collected data for a significant decision via ADMT.","spans":[],"direction":"ltr"},{"type":"heading4","text":"Can users opt out of AI decisions? ","spans":[{"start":0,"end":34,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"Yes, for significant decisions. 
Businesses must provide at least two accessible opt-out methods and must have a non-ADMT alternative process in place for consumers who exercise that right.","spans":[],"direction":"ltr"},{"type":"heading4","text":"How does California compare to the EU AI Act? ","spans":[{"start":0,"end":45,"type":"strong"}],"direction":"ltr"},{"type":"paragraph","text":"The EU AI Act is product-focused and provider-centric, regulating AI systems through conformity assessments and technical documentation. California's ADMT framework is consumer-focused and deployer-centric, granting consumer rights and requiring risk justification. Both cover similar high-risk decision domains through structurally different compliance mechanisms. Organisations subject to both should integrate their assessments rather than running parallel programmes.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"},{"type":"paragraph","text":"California's AI compliance obligations aren't approaching.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"They're here. Risk assessments required from January 1, 2026. Consumer rights infrastructure due by January 1, 2027. CPPA attestations due by April 1, 2028. 
A regulator with active investigations in the hundreds and the institutional knowledge to find compliance gaps.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"The window for proactive preparation is now — not after the first enforcement action lands.","spans":[],"direction":"ltr"},{"type":"paragraph","text":"See how Secure Privacy's AI governance platform and consent management tools help businesses inventory their ADMT systems, implement consumer-facing controls, and maintain the audit-ready documentation California's new regime demands.","spans":[{"start":0,"end":234,"type":"hyperlink","data":{"link_type":"Web","url":"https://secureprivacy.ai/","target":"_blank"}}],"direction":"ltr"},{"type":"paragraph","text":"","spans":[],"direction":"ltr"}]}}},{"id":"6bb4703a-221f-57bf-b32a-a7cb5d978117","slice_type":"centralized_cta_from_blog_single"},{"id":"177f48c4-ba32-5f84-8612-bf5c94b3fa88","slice_type":"articles","primary":{"title":{"richText":[{"type":"heading2","text":"Blog Posts\nThat also interest you","spans":[{"start":11,"end":33,"type":"strong"}]}]},"buttontext":{"richText":[]}}}],"description":{"text":"Your engineering team shipped a new AI feature three months ago. It screens job applicants, ranks them by predicted fit, and surfaces a shortlist for the hiring manager. 
Nobody called it \"regulated.\" "}},"tags":["Data Protection","AI Governance"]}}]},"allPrismicBlogpage":{"edges":[{"node":{"uid":"blog","type":"blogpage","lang":"en-gb","id":"8be6fe51-0ae2-581d-9e23-8b00e02986c1","data":{"cta_button_text":{"richText":[{"type":"paragraph","text":"Sign-up for FREE","spans":[],"direction":"ltr"}]},"cta_button_link":{"url":"https://cmp.secureprivacy.ai/onboarding"},"cta_banner_text":{"richText":[{"type":"paragraph","text":"No credit card required","spans":[],"direction":"ltr"}]},"cta_banner_heading":{"richText":[{"type":"paragraph","text":"Get Started For Free with the\n#1 Cookie Consent Platform.","spans":[{"start":16,"end":20,"type":"strong"}],"direction":"ltr"}]}}}}]}},"pageContext":{"id":"cf811e97-f738-5b17-a4cc-5795d204c447","uid":"california-ai-regulations-2026","lang":"en-gb","type":"blogpostpage","url":"/blog/california-ai-regulations-2026"}},
    "staticQueryHashes": ["106289065","1254728886","1714079170","2867542246","3445072782","764283450"]}