
Recap: What the Sephora Fine Tells Us About Future Enforcement

DataGrail, September 20, 2022

The recent Sephora fine levied by the state of California sent shockwaves through the privacy community. Privacy practitioners had long speculated about what the state's first major CCPA enforcement action would be, with many expecting the Attorney General to focus on children's privacy issues.

Instead, the AG took a stand on how website publishers use third-party web analytics and ad-targeting services in spite of California's privacy regime. Rob Bonta made it clear that it's no longer okay for companies to exchange people's data for non-monetary but still valuable gain without notifying consumers or giving them the opportunity to opt out.

The settlement was part of ongoing investigative sweeps. In announcing the settlement with Sephora, the AG's Office provided a new batch of enforcement case examples flagging opt-out and transparency issues across multiple industries.

To further the conversation, DataGrail CEO Daniel Barber brought together Rick Arney (co-author of the CCPA/CPRA) and DataGrail CTO Cathy Polinsky (formerly of StitchFix and Shopify) on September 8, 2022, to discuss the Sephora fine and the future of CPRA enforcement on LinkedIn Live.

Where It All Began: CCPA & CPRA 

To set the stage, Rick Arney shared some background on the CCPA and its successor, the CPRA. It’s incredible to think that such landmark legislation could start from the initial conversations of two frustrated neighbors whose kids went to school together. However, hearing directly from Rick illuminated the intent behind the legislation.

You can read the full CCPA origin story here.

“It was all based on the feeling that companies were not respecting the information that was being collected and how they were dealing with that information.”
–Rick Arney

The vision really came full circle when taking a look at the Attorney General’s statement regarding Sephora. The missteps boil down to three obligations that Sephora (and other investigated businesses) failed to meet:

  1. Notifying consumers of all the ways in which personal data is “sold.”
  2. Offering means of opting out per the CCPA Regulations.
  3. Honoring received opt-out requests in their various forms, including through browser signals (see the sketch after this list).
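One such signal is the Global Privacy Control (GPC), which browsers send as a Sec-GPC: 1 request header and expose in the page as navigator.globalPrivacyControl. As a rough sketch only – not Sephora's or DataGrail's implementation, and with hypothetical helper names – honoring it could look something like this in TypeScript:

```typescript
import type { IncomingMessage } from "http";

// Server side: browsers with GPC enabled send a `Sec-GPC: 1` request header
// (Node lowercases header names). Hypothetical helper for a request handler.
export function requestHasGpcOptOut(req: IncomingMessage): boolean {
  return req.headers["sec-gpc"] === "1";
}

// Client side: the same signal is exposed as navigator.globalPrivacyControl.
// The property is not yet in TypeScript's standard DOM typings, hence the cast.
export function browserHasGpcOptOut(): boolean {
  return (
    typeof navigator !== "undefined" &&
    (navigator as { globalPrivacyControl?: boolean }).globalPrivacyControl === true
  );
}

// Hypothetical gatekeeper: only fire third-party analytics/ad tags when no
// opt-out signal is present (and no opt-out is otherwise on record).
export function maybeLoadThirdPartyTags(loadTags: () => void): void {
  if (browserHasGpcOptOut()) {
    // Treat GPC as a Do Not Sell / Do Not Share request for this visitor.
    return;
  }
  loadTags();
}
```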

All three are closely linked to the original intent that started the CCPA and CPRA process: let consumers know what is going on and give them control over their data.

Why does the Sephora news matter to all of us?

“What we think is sensitive today could be different tomorrow.”
–Rick Arney 

What was clear from the webinar is that the fine is indicative of a changing privacy paradigm that California businesses cannot ignore. The CCPA’s expansive definition of “sale,” a landmark of California’s regime, specifically covers exchanges of personal information for anything of value. It’s evident that as we go deeper into the CPRA and its (proposed) regulations, companies are expected to change their mindset around commonplace data-for-value exchanges.

Mindset shifts discussed included: 

  • Getting into the habit of looking more closely at third-party SaaS relationships – including the contracts.
  • Coming to grips with the volume, velocity, and variety of personal data popular analytics and targeted-ad providers can collect.
  • Noting how the information we think of as sensitive or not sensitive can change depending on context — so you have to think in a pro-privacy way.
  • Recognizing that California policymakers are intentionally looking beyond traditional data brokering to address how data is monetized across a digital economy dependent on monitoring consumers behind the scenes.

The kinds of data the retailer shared with tracking providers were an important ingredient in the case. One example that Cathy and Rick highlighted was how the California Attorney General specifically called out data points from which a woman’s geolocation and health condition could be inferred – information that is both sensitive and potentially perilous in light of the recent Roe v. Wade reversal.

“I thought it interesting as a woman that they mentioned the geolocation of women and prenatal vitamins in the AG’s press release… This ties to how much things change, and public opinion changes on these things.”
–Cathy Polinsky 

A changing awareness of unexpected data flows is coupled with changing legislative definitions in California. Daniel Barber noted that “information that perhaps has been shared with another provider” can constitute a “sale” as outlined by the CCPA. If history is any indicator, this is not something that companies can continue to sit on the fence about.

“We’ve been tracking this type of data around, you know, Do Not Sell requests and Right to Know requests since the beginning of the CCPA… About 63% of requests that we see are, in fact, for Do Not Sell, and that number is actually increasing.” –Daniel Barber 

Current events confirm this is a rising tide – the AG issued a stark warning to all businesses that “the kid gloves are coming off” regarding noncompliance, while encouraging Californians to exercise their enforceable opt-out rights.

Should we be surprised by the Attorney General’s actions? 

Anecdotally, over the last few days many have remarked that they were surprised the Attorney General’s Office went after a retailer and not a broker or Big (Ad)Tech giant. Given much of the recent federal- and state-level effort surrounding children’s online safety, many thought the first public action might center on that topic (and not tracking software). Our panelists were split on just how surprising the enforcement actually was.

Rick believed that the first major enforcement might have centered on children’s privacy, especially given the legislation that passed the California Assembly and is currently awaiting the Governor’s signature: the Age-Appropriate Design Code Act.

However, having worked with eCommerce companies for many years, Cathy was not surprised. Retailers want to focus on their specific products and business, so they lean on SaaS more heavily than most other industries to help them do that business. These include technologies supporting product recommendations, shopping cart abandonment emails, loyalty program management, and yes, lookalike audience modeling and personalized advertising, among other solutions.

The panelists then discussed managing privacy in “the cloud paradox.”

Managing the Cloud Paradox 

Cathy worked in eCommerce for many years and painted the picture of “the cloud paradox” nicely. Over time, and with technological advancements, SaaS apps became a prominent part of businesses’ tech stacks.

According to Okta, the average number of SaaS apps per customer remains at 88, and companies with 2,000 or more employees deploy an average of 175 apps. With so many applications, it’s easy to empathize with companies tasked with figuring out where personal information and sensitive data may be lurking. However, under the CPRA, it’s a task that companies must take seriously, lest they follow in the footsteps of Sephora and others. Further, when managed correctly, data privacy can open up the doors of opportunity.

“It really is about starting with your data map – understanding where the data is flowing, understanding the technologies you are selecting. There are some solutions that use the data you share – what you thought was a perfectly reasonable thing for your customers – to help train other use cases.” –Cathy Polinsky

“[Seeing data maps] becomes a powerful moment. Because a) [companies] can maybe better tailor things with clients, b) can embrace the culture of privacy, and get even closer to clients by respecting that information. It’s actually a good thing to do and is valuable, and also, guess what, keeps you out of trouble.” –Rick Arney
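
To make the data-map idea concrete, here is a minimal, hypothetical sketch of what one entry might record for each system in a tech stack: the categories of personal data it touches, whether the data leaves your walls, and whether the vendor may use it for its own purposes. The field names are illustrative assumptions, not a DataGrail schema.

```typescript
// Hypothetical shape of a single data-map entry; field names are illustrative.
type DataCategory = "email" | "geolocation" | "purchase_history" | "health_inference";

interface DataMapEntry {
  system: string;                          // e.g. "web-analytics-vendor"
  purpose: string;                         // why the business shares data with it
  categories: DataCategory[];              // personal data categories involved
  leavesOurWalls: boolean;                 // does data flow to a third party?
  vendorUsesDataForOwnPurposes: boolean;   // e.g. improving ad targeting for other clients
  contractRestrictsSecondaryUse: boolean;  // is secondary use contractually limited?
}

// Entries where data leaves your walls and the vendor benefits from it, with no
// contractual restriction, are the ones most likely to raise "sale"/"share"
// questions under the CCPA/CPRA and to need a Do Not Sell workflow.
function needsCloserReview(entries: DataMapEntry[]): DataMapEntry[] {
  return entries.filter(
    (e) =>
      e.leavesOurWalls &&
      e.vendorUsesDataForOwnPurposes &&
      !e.contractRestrictsSecondaryUse
  );
}

// Example usage with a made-up entry:
const inventory: DataMapEntry[] = [
  {
    system: "web-analytics-vendor",
    purpose: "site analytics and lookalike audiences",
    categories: ["geolocation", "purchase_history"],
    leavesOurWalls: true,
    vendorUsesDataForOwnPurposes: true,
    contractRestrictsSecondaryUse: false,
  },
];

console.log(needsCloserReview(inventory).map((e) => e.system));
```

The advice below maps onto the same questions.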

Advice for companies: 

  1. Make sure you know where data resides in your tech stack, whether it stays within your walls or goes outside to other parties. 
  2. If data goes outside your walls, understand the other party’s privacy and security practices. 
  3. If they use your customers’ data for their own commercial uses, such as to improve ad targeting for other clients, both parties need to comply with Do Not Sell obligations.
  4. If you do not want the data you share to be used this way, you can contractually restrict such secondary uses. 

As Rick and Cathy discussed, the CCPA/CPRA – like the GDPR – creates a chain of privacy responsibility for businesses and their technology providers. 

Creating Data Flywheels of Trust

Privacy and business outcomes can be friends, not foes. In fact, when you’re handling privacy and security correctly, and the customer truly trusts your organization, it can create something of a “data flywheel” of trust. A data flywheel is purpose-meets-minimization. It’s a pretty simple equation: (1) the customer knows the information you’re collecting, (2) they know why you’re collecting it (hint: to provide value), and (3) they know they have control over when they no longer want it to be in use. Then you’ll frequently find happier customers who are willing to share their personal information, and their business, with your company.

People love personalized experiences, and it’s often in the best interest for both the business and the consumer, but that trust must be maintained in order for it to work smoothly.

The settlement with Sephora and the AG’s parallel investigations underline why the CPRA’s authors added the concept of “share” to the concept of “sale.” What businesses do, and whom they work with, to achieve personalization is not always intuitive, understandable, or acceptable to consumers. With Do Not Share, Californians gave themselves an explicit right to opt out of their data being shared across companies for personalized ads. From the settlement and adjacent casework, it is clear that the AG – like European data protection authorities – is increasing pressure for adtech reform, starting with those who use these technologies at scale.

“It’s about giving customers choice… I don’t think that there’s a conflict between personalized experiences and data privacy. But if you break that trust, you will never gain it back from your consumers. And so it’s very important to think about these things from the get-go to build them into your workflows and to be responsive to consumers’ requests.” –Cathy Polinsky

Stay informed on the latest data privacy news and privacy regulations and insights with our newsletter.