I had the opportunity to attend the IAPP Global Privacy Summit in Washington, D.C.—a dynamic gathering of privacy professionals from around the world. While sessions covered everything from AI governance to enforcement trends, a few consistent themes emerged—in panels, hallway chats, and moments of hard-earned insight.
Here are my top takeaways from this year’s Summit:
1. Privacy Starts with Empathy
In our health tech privacy panel, Christina Moncrief posed a simple challenge: “How would you feel if this were your data?”
That question reframes privacy from a compliance exercise into a human one. It urges us to lead with ethics, not obligation—and when we do, privacy becomes not just credible, but aligned with business goals. Empathy builds trust. And trust builds momentum.
2. Start with Harm, Not Just Risk
Meredith Halama of Perkins Coie encouraged a shift in mindset: begin with the real-world harm data misuse can cause, then build your privacy program from there.
This harm-first approach puts consumer impact ahead of legal exposure. It’s more principled, more resilient, and better aligned with where regulators—and plaintiffs’ attorneys—are likely headed.
3. Proximity to Product Is Power
To influence outcomes, privacy professionals need to be embedded—not just consulted. That means building strong, trusted relationships with product and engineering teams from the start.
Speakers emphasized that lean privacy teams can actually be an asset: more nimble, more integrated, and more trusted by their partners. To be seen as enablers—not blockers—privacy leaders need to understand the tech, the roadmap, and the tradeoffs. And be there early.
4. Geofencing and Location Data Face Growing Scrutiny
Location data, especially in sensitive contexts like healthcare, is emerging as a regulatory flashpoint.
Companies using geofencing should reassess how they collect, store, and share location data—with a focus on necessity and proportionality. This is one area where enforcement is just getting started.
5. The FTC Is Looking at Downstream Use
FTC regulators made it clear: the agency isn’t chasing new rules—it’s focused on enforcing existing norms and stepping in where the market fails consumers.
They also spotlighted a key emerging question: What responsibility do companies have for how their tools are used after deployment? With AI especially, enforcement may increasingly hinge on use—not just design.
6. Pixels Are Now a Legal Minefield
Once a technical detail, pixel tracking is now a front-line legal risk. Especially in healthcare and adtech, default placements are being challenged under the VPPA, wiretap statutes, and even common law.
If your pixels are embedded by default, reassess now. What once felt like convenience could be construed as surveillance by design.
7. Consent and Chatbots: Hidden Risks
Consent banners and user disclosures are under renewed scrutiny—particularly in health-adjacent services.
A few key watchouts:
- Misfiring social media pixels can capture sensitive data without valid consent
- Chatbots are collecting user inputs that often include health-related data—sometimes unknowingly
Regulators are paying closer attention to real user flows, not just what’s written in a policy.
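One practical way to align the real user flow with the policy is to gate every pixel or analytics call behind an explicit consent check, so nothing fires by default. A minimal sketch (the consent categories and pixel URL here are hypothetical illustrations, not anything discussed at the Summit):

```javascript
// Hypothetical consent state, e.g. populated from a consent management platform.
const consent = {
  analytics: true,
  advertising: false, // user declined ad tracking
};

// Fire a tracking pixel only when the user has consented to its category.
// Returns the pixel URL if it would fire, or null if consent is missing.
function firePixel(category, pixelUrl, consentState) {
  if (!consentState[category]) {
    return null; // no consent: the request is never made, so no data leaves the page
  }
  // In a browser this would be something like: new Image().src = pixelUrl;
  return pixelUrl;
}

console.log(firePixel("analytics", "https://example.com/px.gif", consent));   // fires
console.log(firePixel("advertising", "https://example.com/px.gif", consent)); // blocked: null
```

The design point is defaulting to "off": if a consent category is absent or false, the network request simply never happens, which is what regulators reviewing actual user flows will look for.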
8. Health Data Requires Precision—and Respect
If your business touches health data, clarity is critical. You need to know exactly what’s being collected—including data that could infer a health condition.
Consent isn’t just a checkbox—it’s a relationship. Communicate clearly. Offer control. Trust is built on transparency, especially when dealing with health-related information.
9. Privacy Programs Must Evolve
A functioning privacy program is no longer optional—it’s a prerequisite for doing business and earning trust.
In today’s environment, privacy must be agile, integrated, and strategic. With shifting regulations and rising expectations from partners and customers, privacy isn’t just compliance—it’s a competitive edge.
Final Thought: Embed Privacy, Don’t Bolt It On
The most resonant theme at the Summit? Privacy must evolve from a checklist exercise into a strategic enabler.
That means:
✔️ Prioritizing real-world harm
✔️ Embedding ethical decision-making in product design
✔️ Partnering early with engineering and product
✔️ Anticipating misuse—not just intended use
When privacy is done right, it doesn’t slow things down—it makes everything stronger.
Thanks to the IAPP for another exceptional Summit—and to the many voices advancing our field with insight, empathy, and integrity.