As someone who’s been in the privacy world for the better part of the last decade, I think 2026 is going to be a pivotal year. In fact, I think that there are some big things coming that could completely shake everything up.
I’m always asked: What are you seeing in the privacy space? What new laws are coming? How do I align my privacy program? Should I get privacy tech?
For that reason, I decided to embark on a project this year: writing about what I’m seeing happen in the industry, so you can see around the corner and get ahead of big shifts before they hit your business like a tidal wave.
It will be a weekly email filled with my privacy perspective and helpful practical privacy ops tips.
Prefer to just get our bi-weekly Privacy Highlights? You can manage your preferences.
I’ll share what I see happening and my thoughts for 2026. But before we get into that, I want to introduce (or reintroduce) myself and tell you a little bit about why I think you might want to pay attention.
Who’s Jodi?
After a long career in accounting, finance, and corporate strategy, I made another career pivot: this time, building a targeted ad network (basically stalking you online with car ads) at AutoTrader.com, all the way back in 2008. That was before Facebook was doing it.
In fact, we had numerous companies looking to us for our automotive shopper data … I said no thank you. Before the 19+ state privacy laws and GDPR, there was self-regulation. In particular, the online advertising industry came together to form the Digital Advertising Alliance (DAA).
It was most known for those little blue triangles that say AdChoices (yes, it is still around but mostly with just “AdChoices” in the footer).

I was responsible for our compliance at AutoTrader.com. I still remember reading this little article in MediaPost:

From there, I built the first privacy program at what became Cox Automotive (a big conglomerate in the automotive space; the industry calls it “cradle to grave”) and then went to Bank of America, where I was a privacy lead for many of the digital initiatives in retail banking.
In 2017, I recognized that there was a shift happening starting with GDPR (effective May 2018), and that privacy would grow in importance but that companies didn’t have the resources or capabilities to operationalize privacy.
That’s when Red Clover Advisors was born and we’ve been filling that void ever since.
Why Red Clover Advisors?
Red is bold and starting this venture was bold for me.
Clover is the intersection of data privacy, digital marketing, and data strategy, and we’re advisors to companies (and really partners for those who buy our fractional privacy officer services; curious what that is? Check it out here).
In the last 8 years, we’ve served hundreds of companies, from startups to the Fortune 100, and produced hundreds of articles, guides, and whitepapers. I co-host the popular She Said Privacy/He Said Security Podcast (5 years running), wrote the WSJ best-selling book Data Reimagined: Building Trust One Byte at a Time, have spoken at leading industry conferences such as IAPP and the Privacy Security Forum, and have been quoted in leading media like CNBC, FOX News, WSJ, The Economist, and so much more.
Confluence of Privacy, Data & AI
In the US, we’re up to 19 signed state comprehensive privacy laws, additional sectoral laws, AI governance laws and nearly 150 global privacy laws with GDPR still considered the gold standard.

Data is now cheap to store, and the AI age is leading companies to collect more data than ever. Consumers are starting to wonder: just what is this company doing with my data? B2B companies aren’t doing business with vendors that can’t show compliance with privacy laws, aren’t protecting data, or are using data for themselves.
We’re at an inflection point where privacy regulation is overwhelming, there’s an increase in data collection and use within companies, and an erosion of trust among purchasers (yes, humans are still making B2B decisions).
95% of customers said they would refuse to buy from a company if data was not properly protected (Cisco’s 2025 Data Privacy Benchmark Study).

I’ve always believed companies should put the customer first and consider customer expectations. Customers buy a product or service because they trust it will fulfill their want or need.
They also expect their data not to be misused. We’re seeing “trust centers” appear: pages listing privacy, security, and AI practices that signal to the outside world that this is a company an individual can trust.
I love Grammarly’s trust center, with easy-to-find sections covering privacy, security, compliance, and responsible AI.

Since my cookie baking skills aren’t quite full-time career worthy, I’ve gone all in to simplify privacy for companies.
Balancing business risk with privacy requirements plus customer expectation is unique for each company.
I really believe that privacy should be right-sized for each company. In fact, this is a concept that I incorporate with every single one of our clients. It’s what we call “Right-Sizing” for risk and it’s a key part of our PrivacyOps Framework.
New privacy laws, regulations and enforcement actions are popping up everywhere. Making sense of it all for companies is what keeps me smiling. Which brings me to the current state of privacy.
The Current State of Privacy
The theme is MORE.
- MORE laws and amendments.
- MORE enforcement actions.
- MORE data processing.
- MORE operational complexity.
- MORE automation + humans.
- MORE media attention.
1. More laws and amendments
I predicted that in 2025, we’d have 25 state comprehensive laws passed. It’s a good thing no one bet on that, because it didn’t come to fruition. We came close in Massachusetts, but the bill didn’t pass. Instead of new laws passing, 9 states amended their existing privacy laws (Connecticut, Colorado, Oregon, Texas, Kentucky, Virginia, California, Utah, Montana).
In 2026, I think there is a good chance that new states will pass comprehensive privacy laws. I also think we’ll see more amendments covering sensitive data, children’s data (we’ll watch this space closely), changing in-scope thresholds (like how Connecticut lowered its threshold from 100,000 to 35,000 consumers, effective July 1, 2026), privacy rights, and AI, which is sometimes covered in privacy laws.
And if it’s anything like prior years, expect them to drop at 5pm on a Friday or holiday weekend. We hear all the time that it’s so challenging to keep up with it all, and it really is. The laws and changes are coming fast and furious. It’s why we created this super handy State Map.

I’m always asked: will there be a federal privacy law? I don’t think 2026 is the year we’ll see one. I’m keeping a close eye on federal activity and will update as things evolve.
2. More enforcement actions
As we’ve seen in California, regulators are active: CCPA cases against Honda, Todd Snyder, Healthline Media, Tractor Supply, Jam City, and Sling TV. The California, Connecticut, and New York AG offices teamed up to reach a settlement with Illuminate Education. Expect more collaboration among regulators, especially now that the Consortium of Privacy Regulators spans 10 states.
We’re still waiting for the results of the California, Connecticut, and Colorado Global Privacy Control (GPC) sweep that was announced in September 2025.
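For context, the GPC signal itself is technically simple: a participating browser or extension sends the `Sec-GPC: 1` request header (and exposes `navigator.globalPrivacyControl` to JavaScript). As a rough illustration, not any particular vendor’s implementation, server-side detection against a plain headers dictionary can be sketched like this:

```python
def request_has_gpc(headers: dict) -> bool:
    """Check whether a request carries the Global Privacy Control signal.

    Per the GPC specification, user agents send the header `Sec-GPC: 1`
    when the user has enabled the signal; under laws like the CCPA, that
    signal must be honored as an opt-out of sale/sharing.
    """
    return headers.get("Sec-GPC", "").strip() == "1"


# Hypothetical requests: one from a browser with GPC enabled, one without.
gpc_request = {"User-Agent": "ExampleBrowser/1.0", "Sec-GPC": "1"}
plain_request = {"User-Agent": "ExampleBrowser/1.0"}

print(request_has_gpc(gpc_request))    # True: treat as an opt-out
print(request_has_gpc(plain_request))  # False: no signal present
```

Note that most real web frameworks normalize header-name casing for you; the point is simply that honoring GPC starts with a one-line check, and the hard part (as the sweeps suggest) is wiring that check into every downstream sale/sharing flow.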
Friendly reminder, we’re only seeing what regulators want us to see. As my many privacy attorney friends tell me, regulators are sending inquiries to companies that are not always made public AND they are talking to each other.

We’re always keeping our ears to the ground and a finger on the pulse of what’s happening behind the scenes, so I’ll continue to share these nuggets.
Increased enforcement also means that laws are moving from theory to execution, and regulators are making it clear that “good enough on paper” is no longer enough. Showing your work is essential.
Now is also a good time to make sure you have the table stakes working: opt-outs (make sure that cookie consent software is functioning), an updated privacy notice, and an accurate privacy rights form.
While writing this today, I saw a company launch a brand new website with a dark pattern in its cookie banner, a privacy notice 2.5 years old, and a webform that doesn’t have the proper workflows. If I can see it, so can a regulator. Don’t unnecessarily put a target on your back!
Buckle your seatbelts, because we’ll see a steady increase in regulator activity. If you’re curious about what’s top of mind for regulators and will be in Washington, DC on January 27, join me: Red Clover Advisors is hosting a Privacy State of the Union alongside Ketch and Kelley Drye.
This will be a front row seat to hear directly from regulators, industry thought leaders, and a chance to network with other privacy pros also trying to figure out this privacy thing. Register here now as seats are filling up fast!
3. More data processing
Privacy laws say to collect the least amount of data possible.
Business owners can track every click and want as much data as possible.
AI thrives on data to build its models and perform more accurately.

To comply with privacy laws’ data minimization obligations, companies need to understand the data collected, used, stored and shared. Then they can advise the business on how best to move forward.
The kind of data collected matters too.
Privacy requirements don’t always line up across laws. For example, what counts as sensitive data in one state isn’t the same as in another, and how kids’ data may be processed isn’t the same either.
How do companies work through these nuances? By starting with a data inventory. It’s a company’s starting point and foundation for a privacy program.
And you might not be surprised that I have a lot of strong feelings on data inventories, how to conduct them and maintain them which I’ll cover in future issues.
With the introduction of AI, personal, confidential, and public data is going into AI systems at a fast pace. Look at the chart below showing nearly 1 trillion datapoints being used to train AI models. For companies with personal information, it’s important to determine whether personal information can be shared with training models, especially public ones.
As a privacy professional, I recommend not sharing any personal or confidential information with models whose training data can be used for other companies. This practice is common, and it takes reading the fine print in the terms & conditions and privacy notices to figure out what’s happening.

4. More operational complexity
Privacy laws are not getting simpler. Companies have to manage across varying definitions of sensitive data, children’s data, cookie consent requirements, privacy litigation concerns, and an ever changing business environment.
That means getting PrivacyOps right is essential. It means knowing which laws are in scope, then deciding how you’ll treat the people who are out of scope, or what to do when laws differ.
For example, I live in Georgia where there is no privacy law.
Should a company allow me to exercise a privacy right request?
Treat my data as sensitive as defined by another law?
What about a state that is opt-out for sensitive data (like California) versus a state that is opt-in (like Colorado, Connecticut, or Virginia)? Companies have to consider how they will manage those contradictions.
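One way teams keep these contradictions straight is a simple decision table mapping each state’s model to a processing rule. The sketch below is purely illustrative; the state codes, labels, and helper name are my own shorthand, not a legal reference:

```python
# Hypothetical decision table: how a handful of states treat sensitive data.
# "opt-in"  = consent required before processing
# "opt-out" = processing allowed until the consumer objects
SENSITIVE_DATA_MODEL = {
    "CA": "opt-out",
    "CO": "opt-in",
    "CT": "opt-in",
    "VA": "opt-in",
}


def may_process_sensitive(state: str, has_consent: bool) -> bool:
    """Return True if sensitive data may be processed for this consumer."""
    model = SENSITIVE_DATA_MODEL.get(state)
    if model == "opt-in":
        return has_consent  # consent must be collected first
    return True             # opt-out model, or no comprehensive law (e.g., GA)


print(may_process_sensitive("CO", has_consent=False))  # False
print(may_process_sensitive("CA", has_consent=False))  # True (until opt-out)
```

Many companies sidestep the table entirely by applying the strictest rule (opt-in) everywhere; the tradeoff is simplicity versus friction in states that don’t require it.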
PrivacyOps covers how companies actually execute on their privacy requirements.
In fact, we built an entire framework around these requirements, our PrivacyOps Framework. We’ll dive deeper into each of them in future issues.

As an overview, and to plan for 2026, companies need an actual process for conducting their data inventory, performing privacy risk assessments (hello, CCPA privacy risk assessment requirements), honoring privacy rights, reviewing third-party risk assessments, managing cookie and digital tracker governance (which includes cookie consent requirements), training, and notices, both internal and external.
Regulators might look first at the outside: the privacy notice, the privacy rights webform, and the cookie consent setup. Then they look deeper and ask companies: how do you know what data you are processing? How did you review this high-risk activity? How did you review a vendor?
Just like in math class, showing your work is important. It’s not just about the answer – it’s about the thought process and how you got there. That’s why documentation is required under laws like GDPR and now also in the US, like CCPA.
We often find that companies might have a template but not a thorough and repeatable process to know when an assessment should be performed, how it’s reviewed, what risks are identified and how those risks are mitigated. With a solid assessment in place, when a new law or amendment comes, it will be easy to review, add the appropriate questions or note which risks to look for.
Our clients who have this in place tell us it’s then not a huge monumental lift with each new change.
Cookie consent is one of the areas that needs ongoing love and attention. So often, companies set it up and forget about it. Websites change, and that can impact the consent software code, causing it not to work right. New cookies and pixels are added (by you or by a third or fourth party) and are not properly categorized.
Cookies sometimes drop even when they are supposed to be blocked. Tag managers change, which impacts the software code.
We have yet to find a website where everything is working properly.
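A lightweight way to catch this kind of drift is to periodically diff the cookies actually observed on the site against what the consent tool has classified. The sketch below uses made-up cookie names and a plain set comparison; in practice, the “observed” list would come from a crawler or headless-browser scan run before any consent is given:

```python
# Cookies the consent platform has categorized (hypothetical names).
declared = {"_ga", "session_id", "consent_state"}

# Cookies actually observed on the live site before any consent was given
# (in practice, collected by a crawler or a headless-browser scan).
observed = {"_ga", "session_id", "consent_state", "_fbp", "vendor_px"}

unclassified = observed - declared  # set before consent, never categorized
stale = declared - observed         # categorized, but no longer in use

print(sorted(unclassified))  # cookies that need review or blocking
print(sorted(stale))         # entries to clean out of the consent tool
```

Run on a schedule, a diff like this turns “set it and forget it” into an early-warning system: anything in the unclassified bucket is a cookie firing without a consent decision behind it.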
5. More automation + humans
Many companies are looking to privacy tech to automate processes like data inventory, privacy rights and privacy risk assessments. There is also talk about agentic AI. We do a lot of privacy tech implementations and I’m all for automation and making compliance more efficient.
For companies who have no data inventory, or no privacy risk assessment process, or even no one submitting a privacy rights request, a manual process might be the right fit. I’d much rather a company do something than nothing.
For some companies, automation is out of reach for reasons such as cost, resources, or the ability to find the right tech that will hook up to their systems. Even when automation takes place, there’s still the need to talk to humans to understand HOW the data is being used, how it was collected in the first place, and whether it’s covered in a privacy notice, or properly covered in a contract or terms.
One example where I think humans are still needed, even with automation in place, is asking whether the company should be using the data it’s processing. It might be legal, follow company policy, and comply with a privacy law.
But it might not be expected by the customer, and for that reason it should not be collected, used, or shared. Automation can help flag other privacy risks, and I think humans should be reviewing them.
For those companies that have been doing manual work, now is a great time to consider automation: for example, helping to find all the data elements in systems or connecting data inventories to privacy risk assessments (aka privacy impact assessments, or PIAs).
Companies are savvy at marketing, and many will tout complete automation. But a process and privacy review still needs to be created and executed alongside the automation for a complete picture.
Since humans are involved in deciding how data should be used, humans should also be part of reviewing its privacy risks. Automation can help flag risks, and perhaps even make recommendations, but I don’t think it should take over completely.
There’s still significant room for the humans.
6. More media attention
Consumers care about privacy, and fines make great headlines. When a company uses data in a way that wasn’t expected, that makes headlines too. The media loves a juicy story, and there’s no shortage to pick from, with companies getting fined, more IoT devices using data, and more AI companies using data.
When Honda received its CCPA enforcement fine? It made the headlines.

When a vacuum was caught “spying” on people, it made the news (oh and did you hear that Roomba also recently went bankrupt?)

Smart appliances have been questioned too, and many consumers aren’t quite sure they want data-greedy appliances.

Remember 23andMe? When it went bankrupt, many people realized that they had given a private company their personal data and now wondered what happened to it.

Or how an AI tool like ChatGPT uses your data?

Expect More News Stories
States are also pushing education to inform consumers of their privacy rights. This too will get picked up by the media. The more privacy is in the media, the more consumers will learn their rights and ask questions of companies, and a cycle ensues.
What Should You Do Next?
If you’re a privacy pro reading this, now is the time to create your 2026 plan for how you’ll keep up with the privacy laws, amendments, and regulations that will keep coming. Identify the core privacy operational areas that need focus this year, perhaps starting on them or improving what’s already started. We find that people love printing our 2026 privacy checklist and using it as their personal guide.

If you need resources such as privacy tech, outside advisors, additional team members, or even just buy-in within your company, begin creating those relationships now. Preparation will go a long way for when you need them.
Privacy matters. Your company and customers are counting on you. I’ve made it my mission to reduce the overwhelm, simplify what you need to do and provide practical privacy advice.
What’s most important is that I touch on topics that you’re most interested in and perhaps most concerned about right now.
Maybe the areas where you lack clarity, or would love better direction.
So, the question I’ll wrap with today is this: what’s the biggest privacy topic you’re stuck on right now?
Send me an email to let me know what you’d like to see in upcoming newsletters.
I personally read each and every reply (yes, really I do. No robots or outsourcing, just me!)
So, I encourage you, before you do anything else, hit the reply button.
Until next time,
Jodi
The post Welcome to Privacy Perspectives appeared first on Red Clover Advisors.