New CCPA Risk Assessment Requirements Now In Effect

By Michael Young & Beau Braswell on February 3, 2026

Under newly implemented regulations of the California Consumer Privacy Act (CCPA), California now requires a formal risk assessment “before initiating any processing activity” of certain sensitive types. The regulation explicitly contemplates that businesses will complete risk assessments now, in 2026.

Eventually, such risk assessments – including those completed this year – must be signed by an executive and submitted to the California regulator under penalty of perjury.

Businesses and executives subject to the CCPA must prepare now to address these requirements. In particular, the regulation may impact a wide range of businesses and services, including SaaS/technology firms, payments or financial technology solutions, consumer services, employment or HR applications, AI solutions, and other processing involving California resident data.

The statute and regulations arguably provide for some narrowly tailored exceptions. Exceptions may include financial information subject to the Gramm-Leach-Bliley Act, certain limited employment-related uses, and/or certain health care institutions/information uses governed by HIPAA. However, relevant companies should consult competent legal counsel to assess whether they may fall within the scope of such an exception before relying on it.

Certain key requirements are noted below.

Any Processing of “Sensitive” Personal Information
Businesses must conduct a risk assessment of any processing of “sensitive” personal data. Such “sensitive” data includes:

  • SSN, driver’s license, state ID card, or passport number.
  • Financial account, debit card, or credit card numbers in combination with any required security or access code, password, or credentials allowing access to an account.
  • Precise geolocation of a consumer.
  • Race, ethnicity, citizenship or immigration status, religion or philosophy, or union membership.
  • Mail, email, or text messaging content (apparently including messages sent to the consumer).
  • Individual genetic data.
  • Neural data and/or measurements.
  • Biometric information processed for identification purposes.
  • Personal information collected or analyzed regarding health, sex life, or sexual orientation.
  • Children’s data (under 16 years of age).

Note that some of these categories (e.g., messaging content) may be trivially easy to meet for almost any business that interacts with consumers or provides consumer services. Impacts are potentially heightened further for businesses operating with inherently sensitive data, such as financial, health, or payment data.

ADMT: Use for a Significant Decision & Training

Businesses must also conduct a risk assessment regarding the use of “automated decision-making technology” (ADMT) for a “significant decision.” ADMT can include artificial intelligence technologies or technology intended to replace human involvement. An ADMT makes a “significant decision” when the decision “results in the provision or denial of financial or lending services, housing, education enrollment or opportunities, employment or independent contracting opportunities or compensation, or health care services.”

A business may be within the scope of the risk assessment requirements if, for example, its products, services, or automated activities involve:

  • Providing risk scoring or assessments, or otherwise helping decide whether to extend credit or a loan, exchange funds, offer housing, or set up installment payment plans.
  • Searching and sorting job candidates into an auto-reject category.
  • AI-based screening for health care services.
  • Making any other “significant decision.”

Risk assessments are also required for certain uses of personal information to train an ADMT that will be used for significant decisions, including facial recognition, emotion recognition, and/or identity verification.

Selling or Sharing Data

Businesses must conduct a risk assessment when “selling” or “sharing” data within the meaning of California law. Based on statutory definitions and prior enforcement by California authorities, note that “selling” and “sharing” can include ordinary online tracking and analytics technologies common across many commercial websites. “Selling” and “sharing” can also include other common activities, such as service provider arrangements that are not subject to the strict contractual controls under California law limiting personal data use. For consumer finance businesses, the regulation also specifically notes as an example that a consumer budgeting calculator may involve regulated data “sharing” if, for example, it includes a third-party advertisement.

Automated Processing to Infer

Businesses must conduct a risk assessment before using automated processing to infer certain categories of information about a consumer, including economic situation, behavior, personal preferences, or interests. Businesses should consider whether a risk assessment is required when using AI or other technology to assess job candidates, perform analytics, or form other assessments of individuals regarding “intelligence, ability, aptitude, performance at work, economic situation, health (including mental health), personal preferences, interests, reliability, predispositions, behavior, or movements in a sensitive location.”

Conclusion

The updates to the CCPA regulations went into effect on Jan. 1, 2026, and businesses may now be required to perform a risk assessment before commencing the relevant processing. Content requirements for risk assessments are detailed and require identifying potential harms to consumers, offsetting benefits, and mitigating factors. The regulation provides detailed guidance on both the substance and the form of such assessments; existing assessment procedures are unlikely to meet California requirements unless specifically designed to do so. Finally, as noted above, assessments will ultimately have to be submitted to the California privacy regulator by a managing executive, along with a written statement under penalty of perjury that the risk assessment information submitted is true and correct.

For more information about the updated CCPA requirements, contact a member of Taft’s Privacy & Data Security or Technology and Artificial Intelligence groups.

Michael Young

Michael advises and represents clients on complex privacy, AI and data protection issues. From pre-venture startups to some of the most recognizable brands in the world, whether strategic or transactional, Michael specializes in helping companies find answers that are right for them given their unique challenges.

Beau Braswell

Beau has advised clients on data privacy and cybersecurity matters for more than eight years. He began his legal career in the U.S. Department of Justice, where he obtained a TS/SCI clearance and advised on data protection in the law enforcement and intelligence contexts.
  • Posted in:
    Privacy & Data Security
  • Blog:
    Privacy & Data Security Insight
  • Organization:
    Taft Stettinius & Hollister LLP
