Our 2026 predictions series invites privacy leaders to reflect on what’s in store for 2026. This post was guest written by Steve Lappenbusch (Director of Product Management – Nonprofits, Daxko & former Head of Privacy at a global data broker), a member of the DataGrail Contributors program. Also check out predictions by Daniel Barber and Andy Dale.
I have built and led both product and privacy teams at data brokers and SaaS companies, and one thing is clear: 2026 will change your privacy work life in ways that go far beyond laws and regulations.
Privacy’s long-standing silos are starting to dissolve, several deeply held beliefs are about to be challenged, and many teams will realize that the “data broker boogeyman” may be closer to home than they expected.
1. Yes, you will be a data broker. No, you will not know what to do, but will probably think you do.
In 2026, data brokering will move into the mainstream. A little.
Companies will learn that normal operations both feed on and feed data into a complex global supply chain. I call it the privacy supply chain. More on that later.
Privacy will finally begin to move beyond its consumer, end-user experience bias to focus on and govern how companies sell, combine, enrich, sync, score, create and redistribute data in bulk through routine business tools (we’ll get back to this too).
And they’ll soon discover that data broker laws apply to them.
Most companies will assume they understand the rules. They won't. They will assume they know what to do. They really won't.
They will miss secondary uses and reseller supply chains. They’ll miss cloud integrations and partner enrichment. They’ll miss ad tech repurposing. They will miss data shared and sold from the business and data bought by the business. They will not grasp how complex their data flows have become. They will not understand the data brokering nature of the data maps their tools create for them. They will believe intent matters more than effect.
Privacy pros will face the difficult task of explaining to their C-suite, their customers, and their co-workers, that data brokering is mainstream, not an evil niche of data sorcery. They will need to show leaders why the “broker” label isn’t optional, and they’ll end up building governance for a role the company never planned to hold and that no one internally is qualified to lead.
In 2026, winners will admit a lack of data brokering task-level knowledge and their own uncertainty. They will seek experts to help. The companies that acknowledge data broker confusion first will move the fastest.
2. In the old school tradition of caring if things actually work, consent is finally questioned.
We will confront the gap between actual informed consent and the click-based consent facade we rely on today. True informed consent requires real understanding, knowing what will happen and what it means. This is not possible when data moves through a massive global supply chain at light speed.
Users cannot understand all the systems that touch their data. They cannot grasp all sharing, combining, and inferring. They cannot judge the risk created by actors whose actions they do not understand. Consent cannot handle this level of complexity.
Legislators will see the same truth. Maryland’s MODPA prohibits the sale of sensitive data even with consent. It treats some acts as too harmful to allow. It signals a shift away from user choice as the main safeguard.
The savviest privacy pros will welcome this shift in 2026. They know consent creates false comfort. They know it pushes responsibility onto the person with the least information. And they know privacy cannot depend on endless pop-ups or unread choices.
Consent will not disappear in 2026. But many in the field will stop pretending it solves the real problems.
3. AI sets privacy and product up on a blind date and legal becomes the third wheel.
In 2026, AI will enable traditional privacy tasks to finally begin, but just begin, to move into the everyday workflow of product managers. Privacy work that had once happened before or after development, or as a speed bump to sprint planning, will start to be embedded directly into normal development. Product and engineering teams will complete many tasks as part of their routine workflow. AI will finish more in the background.
AI agents will surface privacy prompts at the right moments: during PRD and epic definition and during acceptance-criteria review. They will ask direct, simple questions that slow teams down just enough to think, but not enough to jeopardize a launch date.
AI will only just begin doing this in 2026. But it will start, and it will begin to bring privacy truly into the product lifecycle.
Privacy pros will gradually shift toward high-judgment work: managing privacy like the product it has always been. This shift will accelerate development, reduce risk, and mark the beginning of a big change.
With this AI-enabled shift, beginning first only in the most effective companies, legal will start to step back from daily privacy operations.
Legal will begin to play the same smaller but utterly critical role it plays in finance and HR: a key resource for high-risk issues, not the department running day-to-day operations. After all, lawyers are expensive, and they’re not needed for (or particularly well-suited to) managing large operational workflows.
Privacy managers will begin to oversee huge enterprises through AI while guiding product, legal, engineering, sales and marketing. The role will not be totally resized and realigned in 2026. The new relationship will begin in uneven fits and starts like any blind date that ends well.
4. The Private Right of Action is increasingly seen as toothless.
Deputizing the profit motive of VC-funded lawsuits as a form of human rights enforcement will never build a better society. This next year will prove it.
I have been an outlier (though not the only one) in opposing the private right of action for a long time. I have seen how it turns privacy into a profit engine instead of a protection. Lawyers make millions. Consumers get a few bucks. Companies make decorative changes.
Tanya Riley recently explained how these lawsuits create a financial windfall for plaintiff firms and little else. Lawsuits feign accountability. The desired outcome is settlement, not safer citizens.
Think about it. Apple settled a lawsuit for $95 million for secretly recording and selling the conversations between parents and children in people’s homes.
Consumers got nineteen dollars. Plaintiffs' firms (presumably) got tens of millions. And Apple… is still Apple-ing.
In 2026, more privacy and tech professionals will recognize that lawsuits don't fix anything because they're a cheap sop that first-party brokers like Apple and Meta prefer. They can price it into their next subscription increase. Meanwhile, smaller companies without the legal budget can't compete.
In 2026, as settlements pile up, we will see proof that private lawsuits just transfer consumer money to plaintiffs' lawyers and do not improve privacy.
5. You will invite the privacy supply chain to the party, only to realize it waves way too many of its unsavory friends into your house.
In 2026, the industry will realize that privacy cannot be understood through a consumer lens. More than $18 trillion in B2B commerce moves data every year, a staggering amount, in ways most people never see or expect. Yet debates, laws and companies still center on cookie banners and check boxes.
Many companies fail to grasp that they are in the data supply chain at all.
Finally, the data privacy supply chain will become impossible to ignore. Companies will see how data moves through vendors, partners, enrichment tools, analytics systems, infrastructure providers, and AI services. Each layer creates new uses and new inferences. Most companies do not track any of this accurately, if at all.
Privacy pros will realize they need to become data privacy supply chain managers. They’ll map flows not simply to different tools, but across dozens of vendors and even more uses. They’ll require visibility into systems that were never designed to offer it. In learning about permissible purposes of their suppliers, vendors and customers, privacy pros will learn what data brokers have long known: wow, all this data is used in some, um, surprising ways.
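To make the mapping idea concrete, here is a minimal illustrative sketch (not anything the author prescribes, and all vendor names, data categories, and purposes below are hypothetical) of treating a privacy supply chain as a directed graph, so that every party data can eventually reach becomes visible:

```python
from collections import defaultdict

# Hypothetical privacy-supply-chain map: each edge records where data
# flows and the stated purpose attached to that flow.
flows = defaultdict(list)

def add_flow(source, destination, data_category, purpose):
    """Record a data flow between two parties with its stated purpose."""
    flows[source].append((destination, data_category, purpose))

# Illustrative entries only; every name here is made up.
add_flow("crm", "enrichment_vendor", "contact_records", "append firmographics")
add_flow("enrichment_vendor", "ad_platform", "contact_records", "audience matching")
add_flow("app_backend", "analytics_vendor", "usage_events", "product analytics")

def downstream(source, seen=None):
    """Walk the graph to surface every party data can reach from a source."""
    seen = set() if seen is None else seen
    for dest, _, _ in flows.get(source, []):
        if dest not in seen:
            seen.add(dest)
            downstream(dest, seen)
    return seen

# A CRM export reaches a party two hops away that the first-party
# relationship never mentioned.
print(downstream("crm"))  # {'enrichment_vendor', 'ad_platform'}
```

Even a toy graph like this makes the section's point: the second-hop recipients are exactly the "surprising uses" that never appear in a tool-by-tool inventory.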
Privacy pros will then ask pointed questions about data provenance. They’ll ask for empirical verification instead of performative statements on paper. Data due diligence will start to become as big a part of data as it is for money or M&A.
Companies that grasp the data privacy supply chain first will set the standards others follow. They’ll treat privacy as a consequence of the entire data supply chain, not a property of one user interaction.
Are you really ready for that?
Steve Lappenbusch is the Director of Product Management – Nonprofits at Daxko and a former Head of Privacy at a global data broker. Find Steve on Privacy Basecamp, our online community of 1,500+ privacy professionals around the world.
The opinions expressed in this article are solely those of the author and do not necessarily reflect the views of DataGrail.