DataGrail Summit 2023
Where legal, security and privacy connect to plan the future of data privacy
Moderated Panel

Crossroads: Where Privacy & Compliance Meet

Ty Sbano, CISO, Vercel
Matt Hillary, CISO, Drata
Chris Deibler, VP of Security, DataGrail

Privacy. Compliance. Security. These terms are often used interchangeably. However, while similar, they aren’t the same. Hear from three industry-seasoned CISOs, Ty Sbano of Vercel, Matt Hillary of Drata, and Chris Deibler of DataGrail, about how they define each term, how and where they intersect to execute on privacy, and where legal fits into the mix.

Transcript

Chris Deibler (00:13):

Hi, how's everyone doing? We have the comfy chairs apparently. What's everyone's energy level? We want to make this spicy or smooth?

Audience (00:22):

Spicy.

Chris Deibler (00:23):

It's been my life's dream to stand in front of a crowd and everyone yells spicy, so thank you for helping me accomplish that. Notes. Okay. Hey, I'm Chris Deibler. I'm the VP of security at DataGrail and I'm here with Ty and Matt and we are here to change the topic entirely and talk about the intersection of security and privacy. And in this case of course, compliance, which is what makes this great.

(00:45):

I have worked in a lot of both high and low maturity organizations where compliance is concerned, and I'm going to save the anecdotes until we actually get into it. Before I was at DataGrail, I was running production security at Shopify. Before that, I brought a security program to Twitch after both their breach and their acquisition by AWS, which was a wildly different compliance environment. So I've got some fun stuff, but I'm actually not the person where the good stuff is going to come from. I would love for Ty and Matt to introduce themselves and tell us what you're doing and where you are and, most importantly, what your business line is, so there's context around what we're about to lay out here. And Ty, please take it away.

Ty Sbano (01:27):

Perfect. Hello everyone. Ty Sbano. I've been doing this for about 18 or 20 years depending on how you look at it. Functionally, I started out in consulting and then I pivoted over to financial services, so JP Morgan Chase, Capital One Financial. I was there when all the mobile experiences started to emerge and everything went to a much different experience of how we interact with the web. I went to Target for about a year. I ran immediately away after 365 days, out here to the Bay Area to get back to FinTech and then eventually pivoted down to startups where I really wanted to be. So I joined a data analytics shop called Periscope Data. We were acquired a year into the journey. Whole bunch of fun stuff happened. You remember the pandemic.

(02:06):

And then most recently I joined this wonderful place called Vercel. It's really the intersection of the front end cloud, so how developers really deliver code in a fast way, and functionally the workflow that supports it. It's been really exciting taking all of my software security experience and security in general and building it as part of the business, because it's a seven year old business, but it really didn't get started till about two years ago since it's rooted in open source. With that, I'll pass it to the wonderful Matt Hillary.

Matt Hillary (02:33):

Thanks, Ty. I'm Matt Hillary. I'm the Vice President of Security and Chief Information Security Officer at Drata. I've been at Drata about five months now. It's been a very eventful five months. Prior to Drata, I've been the security leader at a number of companies, big and small, public and private. Just a little about Drata: Drata is a security compliance automation platform that helps you automate all the painful parts of a compliance program, saves you a lot of time, and has helped enhance a lot of programs along the way.

Ty Sbano (03:00):

You know what's cool? I talked to his founders before they looked for him and then they found him and it's been cool to go back and say, "You found the right guy."

Chris Deibler (03:09):

All right, so the topic is the intersection and compliance is kind of the new element that we're bringing to everyone here today. We'll start with a softball, which is basically tell us how these different silos report into the executive stack within your organization. And then also why did it wind up that way? Was it organic? Was it a choice? Again, left to right.

Ty Sbano (03:32):

I'm happy to start this one. When I joined Vercel, privacy was ultimately just kind of floating. In my past shop, an early-stage Series B, it was also a floating narrative, but GDPR was also new, so I had some experience around the privacy domain. I think the intersection between security and privacy, as well as trust, really worked itself into the same category of teams and resources that go have the conversations with engineers. So within my world, it actually started when I joined. There was a head of legal, and that person said, "I would like privacy." And I said, "Hello, I'm our CISO. I have no idea what any of this looks like yet, but I'm going to dig in." I already knew my security program was going to be a lot of work. So I said, "By all means, have privacy."

(04:14):

Well, 90 days after they said that, they were no longer employed there, and I said, welcome back, privacy. Let's do this together. And since then I've hired two great people, Kacee Taylor and Sabrina Benassaya, as part of our team to really next-level how we think about privacy-first analytics and privacy-first front end clouds. And we actually did an acquisition of a company called Splitbee, which helps from the engineering side as well. So not only are we privacy champs on our team, we actually have engineers that care about it fundamentally, and we pivoted our business based on a business continuity and crisis management tabletop talking about what if Google Analytics becomes illegal in Europe? Well, we were ahead of that curve and we're really happy that we brought those folks in.

(04:54):

And it allows our team to fundamentally, again, interact with engineers as things are being designed, as analytics are being created from data sets, where we're asking, are we comfortable surfacing this for our customers? And if so, how does that actually work? So it ends up becoming a real paradigm shift for the organization as we went from just a front end cloud developer experience to now, how does privacy fit into the narrative? So it's been extremely, extremely exciting.

Chris Deibler (05:20):

So you loved it and you let it go and it came back to you?

Ty Sbano (05:24):

It was meant to be.

Chris Deibler (05:27):

Matt.

Matt Hillary (05:28):

No, it's been organic at every company I've been at. I think Brandon joked earlier about, hey, we're going to do rock paper scissors with the chief legal counsel and say, who's got privacy? I think now it's morphed into a more mature conversation to say, hey, privacy is important for us, it's important for our customers. We want to make sure it's owned wholly. And so at some organizations, the CLO has had phenomenal privacy experience and they've wanted to be our DPO. Here at Drata, when I joined recently, I sat down with our VP of legal and had the same conversation. There's this organic marriage between legal and security in general, one where you want a very strong and competent relationship with them. And so in some cases, like you said, it's bounced back and forth.

(06:05):

Right now I do act as our DPO. That was a very intentional conversation, but it was one that needed to be had to say, hey, what makes sense right now? I do see that organically changing over time depending on what's happening, but right now we actually have two members of our compliance team who own and manage privacy for us and really care about our customers.

Ty Sbano (06:25):

Yeah, one interesting aspect of where we're at in Vercel's journey: we had that legal person I mentioned, and since then it's scared some people off from introducing a legal function internally. So we don't actually have legal counsel inside. We have external counsel, we have trusted partners that we really appreciate. Luckily Kacee has a JD, so we just point to her and say she's our lawyer for the day. But in reality, it changes how you interact with legal. Because it's someone from the outside in, we actually have to play to that business narrative as well. They're not going to fundamentally understand the workflow. They're not going to understand the threat profile, so we bring those narratives to the conversation, but at the same time we have to balance that out and say, well, I'm going to wear my lawyer hat and make sure we have this early discussion, because they're not in the room and I have to represent that as well.

Chris Deibler (07:11):

Speaking specifically to the compliance piece of it, though, at the end of the day, if you bounce on a compliance regime, are you the one held accountable for it? Do you also own those?

Ty Sbano (07:21):

Yes. I think for me it's security, compliance and privacy, IT, and trust and safety. And then what I just call special projects, based on whatever's happening with the company and where we lack resources, or say there's a leadership change and there's just a need for someone to pick something up to usher in the next part of that project or that wave. I think that's an aspect that CISOs often, I don't know if it's a burden or a tax or we just like the pain, I think it's a little bit of the latter, but we look at risk. And when we look at risk across the organization, I think really good CISOs end up just going in and helping, whatever that may look like. And talking to Chris before this, that often ends up becoming, well, we don't have this VP of insert title, I'm happy to take on the project. I'm happy to usher this. Hopefully we get someone in the future. I don't know if I want this long-term. And what's interesting is, if you have the right leadership team, they will understand those interim functions to get to the other side.

Chris Deibler (08:16):

I want to riff on that, but I want to hear from Matt. Do you own the compliance successes and failures as well?

Matt Hillary (08:21):

Yes, absolutely. Just like Ty explained, I look in those same directions. We have IT, we have compliance and privacy, we have security as well. There is significant overlap. At one company, we actually had legal own both compliance and privacy. Now, security compliance was a small bucket, but it was the same thing you were talking about, Ty, which is, hey, this seems unowned with security compliance, who owns it? Let's put that under the CISO for now, but then compliance and privacy fit under the legal team.

Ty Sbano (08:45):

But it's probably easy, right? I mean, Drata on Drata.

Matt Hillary (08:47):

Yeah.

Ty Sbano (08:48):

Compliance is done.

Matt Hillary (08:49):

That's right.

Chris Deibler (08:51):

I've referred to security organizations in numerous past lives as the silo of care of last resort, or the place where broken toys go. If there is a risk to an organization, for the very large capital-R definition of risk, and that risk is not owned, security people can be expected to just pick that up, because someone certainly has to care about it. And we should talk on that more. Before we get to that-

Matt Hillary (09:21):

You make a good point there, Chris.

Chris Deibler (09:21):

Oh, no. Go ahead.

Matt Hillary (09:21):

Which is just that we speak risk. It's our core language as security folks. And so it makes a lot of sense to bring privacy into the fold: hey, what is the risk of having this data? I think we talk about the difference between security and privacy a lot in the sense that security people say, hey, give me all of your data, we'll protect it. And if it happens to trip some of these compliance frameworks out there for having this type of data, then we'll protect it commensurate to that compliance framework.

(09:44):

Now, privacy is very intentional. It adds a layer of transparency, it adds a layer of added scrutiny and life cycle to that data, where someone comes in and says, hey, cool. From a privacy standpoint, these are the data elements we will collect from you. This is how we're going to process that. And if we're going to change that, we'll let you know. And this is how long we plan to keep it. And if you want it gone, we will get rid of it and give you assurances that's happened.

(10:08):

And in that lifecycle, people can ask, hey, what do you have? And so it adds that extra layer of complexity. But honestly, when you talk about building trust with other people, that level of transparency I think helps establish and maintain that.

Chris Deibler (10:20):

Is there a conflict in your head between the security response of, please encrypt this, versus the privacy response, which is please stop collecting this?

Matt Hillary (10:30):

It's almost the same question as, hey, does security equal compliance? Does compliance equal privacy, and does privacy equal security? That whole spectrum. I think they're all intertwined in an ecosystem, personally, and if you follow the essence of all of those intents, they all win together, in my view. I don't know. Ty, what are your thoughts?

Ty Sbano (10:48):

I'm very well aligned. I think there's the concept of the Goldilocks principle: what is the data that's just right for the organization, and what did we disclose as part of our privacy policy, or did we magically ingest a new workload that has all this sensitive information? But if you have the proper data handling practices, I think, encryption and storage and movement of data, you're still going to represent those controls as a CISO or a security practitioner, anyone interacting with customers. So I don't actually have that mental model of fighting with myself. For me, it's more just the conversation of, here is the risk.

(11:22):

And I think if you're with the right team, it's again, it's not who owns the hot potato, because I think the risk owner concept, everyone owns it and then no one owns it. That's the not great experience, but in reality, you need an assigned owner, but you are the usher, you are the janitor sometimes, so I use that joke a lot inside of my own house. CISOs, we are janitors. The party's happening, guess who has clean up? We do. And that's okay because that's part of our role to make sure that we're cleaning up our tracks, following the project plan as we said we were going to do, and honoring probably an enterprise contract that has a specific red line that you remember, but there's no contract management system that helps you remember it.

Matt Hillary (11:58):

When you said janitor, it made me laugh a little bit, because I had someone tell me, "Hey, a janitor in the company that's a board level position, there's so much stuff to clean up after."

Ty Sbano (12:06):

I know I always wanted to be a janitor.

Chris Deibler (12:07):

I promise this is the only audience participation in the entire presentation. If you find yourself in the same situation as the owner of all the cares or at least all the keywords up on the big board like Matt and Ty, I'd love to see a show of hands. Do we have lots of... I see at least one solo practitioner, two. There are some people in the room that share the same responsibility, accountability.

(12:33):

I think given the time, we're probably only going to get to two or three of these, but here is my favorite, because we talked about compliance regimes earlier. As a security leader, working in a security organization solving security problems, you've probably been doing compliance work vis-a-vis audits or other regimes for quite some time. And the industry has had 10, 20, arguably 30 years of muscle memory and experience filling out PCI self-assessment questionnaires and sitting for FedRAMP and pick your compliance regime. Going public, dealing with GDPR, GLBA, and SOX. We don't really have privacy audit regimes per se. Based on the trajectory of how everything's moving, that has to be coming. What have we learned from processing compliance regimes and audit regimes on the security side that we can adopt immediately to not make our lives hell when the privacy audit side of this becomes a real thing?

Ty Sbano (13:35):

I think for me, privacy has always been part of the narrative. As someone that built software security programs to scale at the companies I mentioned, JP Morgan Chase, Capital One, Target, the first step is: what is an application? What are our assets? Where is our inventory? And then functionally, what's the data associated with those apps? And now we're in a more modern place. I think there's technical monitoring, like eBPF monitoring inside of a cloud environment, where you can see what data's flowing, versus push me back seven years ago, I'm like, I'm just going to trust people on a certain scale. And then based on that trust, and how often they're patching, how often they're remediating, how often they answer, or how many incidents they've caused, the trust keeps moving up and down in alignment with the business. So that's where business information security officers, or BISOs or VISOs or business CISOs, whatever you want to call them, emerged from.

(14:24):

For me, having that narrative and that inventory actually allowed a lot of interaction with many other teams. So let it be PCI, let it be the Gramm-Leach-Bliley Act, let it be SOX. Having that asset inventory was always the key to success in my book. Because then I could have a very structured conversation of, well, where's our PCI data? And in a banking environment, it's everywhere. In reality of what that compliance program looks like, I'm going to be honest, it's laughable, because no one is functionally protecting all elements of data leaking everywhere. But you're talking about that customer database, you have a modeled map, you have an architecture flow, and you have your story together. But I think the reality is a little different when the rubber meets the road.

Matt Hillary (15:04):

For me, I think the challenge has been, I've always liked to take the compliance frameworks out there, and some of them are pretty complex. When you think of FedRAMP, where you have a control that's a paragraph long, you want to distill that out into actionable Boolean checks. And that's where Drata's impetus came from: to say, hey, how can we automate this? How can we check this in an automated way? From the compliance frameworks, even then distilling it down to, hey, amazing staff engineer team member, this is what this control means. You do this, this, this, this. And they're like, awesome. That makes it easy for me to build to, that makes it easy for me to implement.

(15:35):

I think the challenge with privacy acts or regulations is that they're still in that "you do these things" framework, and it hasn't been distilled out into specific controls. Although, I could be wrong here, but HITRUST is an example where they've taken HIPAA and actually distilled it out into a control framework. They've created a whole attestation capability for it. But for some of the other ones out there, it's hard; many individuals, or companies like Drata, are actually taking that and trying to distill it out into controls that you can then monitor. But I think we still struggle with what this can be interpreted as and how do I comply with that. And so that's something we can probably learn from.
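For illustration, here is a minimal sketch of the kind of Boolean check being described: a paragraph-long control such as "require MFA for every user" distilled into a single pass/fail test. It assumes an AWS account, the boto3 library, and read-only IAM credentials; the function name and the control chosen are illustrative assumptions, and this is not how Drata actually implements its checks.

```python
# Minimal sketch of a "Boolean" compliance check: one control, one pass/fail answer.
# Assumes boto3 is installed and AWS credentials with IAM read access are configured.
import boto3


def all_iam_users_have_mfa() -> bool:
    """Return True only if every IAM user has at least one MFA device enrolled."""
    iam = boto3.client("iam")
    paginator = iam.get_paginator("list_users")
    for page in paginator.paginate():
        for user in page["Users"]:
            devices = iam.list_mfa_devices(UserName=user["UserName"])["MFADevices"]
            if not devices:
                return False  # one unenrolled user fails the whole control
    return True


if __name__ == "__main__":
    print("PASS" if all_iam_users_have_mfa() else "FAIL")
```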

Chris Deibler (16:07):

The binary thing is really interesting. Because I derive a ton of value from your product doing that, product pitch, but coming into a new organization and doing the triage and saying, what's going on here? You don't think in binary terms. You don't say, am I meeting my data handling standards or not? The question is actually, is this sufficient? And so you find yourself wandering to things like CMMI or various other models of maturity. At the end of the day, some compliance regimes are, did you do the thing? And some compliance regimes are, is this thing sufficient? When we move into the privacy space, which of those two worlds are we going to live in?

Ty Sbano (16:49):

What are the ramifications? I think when I look back at the scenarios of conversing with, let it be marketing leaders, let it be your business leader: what are the ramifications of a privacy violation? At the stage of a startup, I think it's vastly different than if we're at Facebook, if we're at a MANGA company, and all of a sudden we're getting sued the second the new privacy regulation ships. That's just how it works.

(17:14):

Versus the rest of us, it comes down to, as stated in the prior conversation, how many tweets or mean tweets are going to appear versus how many complaints actually occur. And taking some of those seriously has been a little bit harder in the startup realm, so the ramifications aren't as clear. So guiding the company to the right decision, or the right decision at that moment, is a balance of what is it actually going to be? Because if we got fined, we'd defend our program and say, we had all these controls, we did our own self-audits. But what does it mean to be GDPR compliant? Who do we prove that to? Oh wait, we have Privacy Shield. How'd that go?

Matt Hillary (17:53):

From my end, I was just thinking through just how difficult it is sometimes to really implement these programs. Like I mentioned before, distilling it out to, hey, what do these people need to actually do internally to make it work? That's a struggle.

(18:05):

We talked earlier about legal and security, and I don't know, Chris, if you wanted to change directions here, but I was thinking about how in the past I've seen this awesome effect where our legal team internally didn't have the web hooks, per se, into the engineering team members or the product team members that the security team already has. If you have a strong secure software development life cycle, generally you're building in security there. It was really cool several years ago when GDPR came out and they started talking about privacy by design in general.

(18:40):

It was an amazing pairing. I was at Instructure at the time, and our DPO was on the legal team. He's like, "Matt, I don't have these connections with our engineering team members that you do, and you're already talking about the design piece. You're already talking about how we want to push this into the thing." And so that was a really cool effect, to say, "Cool, jump on board. Let's work through this together." So again, that whole distilling of what these laws and aspects mean into what people need to actually, tactically do, that's really where I think the art happens.

Chris Deibler (19:09):

Something that a previous presentation talked about was, I believe lexicon was the word. I like using taxonomy, but I think in this context it's the same thing: both privacy regimes and security regimes, and by extension compliance regimes, care about data. And in order to care about data and do the right thing with data, it must be categorized somehow. And then the right things have to happen to it afterwards. But actually it's the categorization, the taxonomy, the lexicon, that seems to be so much more of an art than a science. And to wit, of the enterprise organizations that I've worked in, none of them has ever had anything even approaching a similar data classification scheme.

(19:57):

I'm not going to speak ill of any former employers, but I've worked in some that are just downright bureaucratic and draconian. And yet that is the thing that informs all of the downstream actions that you take either from a privacy or from a security standpoint. Why is this an art not a science? Why is data classification hard? Why do we not have a standard by which we classify data and then do the good stuff?

Matt Hillary (20:23):

I think in some areas we do. If you have a social security number, it's pretty Boolean: hey, this maps to something that matches someone's identity. Or if you have a credit card number, generally it's like, wow, okay, this is cardholder information. I do have a spectrum between soft-core and hardcore PII, if that makes sense. And that's where I think the art makes it a little more difficult to say, hey, where does this fit and how do we want to protect things?

(20:46):

I was talking to our team this last week about vendor risk reviews, and we're thinking through, man, we want to have our level of risk review commensurate with the level of data they'll have access to, and it all stems from how you're classifying that data. So it really is a problem to say, hey, do I classify this as one of these areas of things that we're going to be handling or they'll have access to? Or is it one of those other areas of things? We do want to get more quantitative in that space, so you bring up a really good question there. Right now some of our risk assessments are still very qualitative, in the sense of saying, what makes sense here? I don't know, Ty, if that's been your experience, but for me, that's been a challenge sometimes.
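For illustration, a minimal sketch of the "pretty Boolean" classification being described: a regex finds credit-card-shaped digit runs, and a Luhn checksum drops most random numbers. The pattern, function names, and sample text are assumptions made for the sketch; a real classifier needs far more context than this.

```python
# Minimal sketch of a "Boolean" data classifier: regex finds card-shaped digit runs,
# a Luhn checksum filters out most random numbers. Pattern and sample text are made up.
import re

CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,19}\b")


def luhn_valid(digits: str) -> bool:
    """Standard Luhn checksum used by payment card numbers."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:  # double every second digit, counted from the right
            d = d * 2 - 9 if d > 4 else d * 2
        total += d
    return total % 10 == 0


def find_likely_card_numbers(text: str) -> list[str]:
    """Return digit strings that look like card numbers AND pass the Luhn check."""
    hits = []
    for match in CARD_PATTERN.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if luhn_valid(digits):
            hits.append(digits)
    return hits


# The well-known test card 4111... passes; the equally card-shaped 1234... run does not.
print(find_likely_card_numbers("memo: 4111 1111 1111 1111, ref 1234 5678 9012 3456"))
```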

Ty Sbano (21:18):

I think it's a struggle. I think relying on RegEx to this day as our classifier and our engine doesn't functionally scale. And I look back at my history of talking to PCI consultants and program managers that literally go and have the conversation of, hey, how difficult would it be to take this string of numbers, the account number, and make it this new transposed alphanumeric string? And I go, do you understand how complex that is in a banking environment? And they go, well, why isn't this easy? They can call the API and I can run it back. And I'm like, you're talking middleware and you're talking all the way back into these databases that haven't been touched that way. And so when you start to change the data as a security team, because you want to protect it more effectively, I think you step on your own toes. And that's the hard part: to what extent?

(22:10):

The one thing I actually struggle with, and maybe this is my spiciest take, is that IP addresses come up a lot for me in our business. And everyone's like, well, how can you have our IP address? And I go, how are we going to serve your customers any content if we don't have the IP address?

Chris Deibler (22:25):

You opened a socket. I can see the socket.

Ty Sbano (22:27):

But, you can't have IP addresses in the US. And I go, okay, so what happens if you fake an IP address or you spoof an IP address? Is that still protected? And so when we go down these paths of what is protected or not protected, or hardcore or not, I think that concept is really tough when you look at freeform fields. If I look at financial services again, it's great to wire money, but then you get into this memo field, and it's like anything can go in there, and tell me you haven't jammed a bunch of stuff in there when you're wiring money, because this is the down payment on the house and I want to make sure it gets to the right place. But then on the backend, you've got people like Chris going, but that's not what the field is for, so why would the data go in there? And so I think we end up in this struggle of how people interact with systems and the intent.

Matt Hillary (23:15):

That was a challenge for me, Ty, at a company where one customer said, well, you need to be HIPAA compliant. You're like, wait, wait. Well, first of all, let me comply with HIPAA; that'd probably be the first way of saying it, rearranging the words there. But to comply with HIPAA, you're like, well, you have a form here that anybody could put anything in. How do you know that they're not putting in data that would subject you to having to comply with these various frameworks or standards or regulations? And you're like, we can't; it's in our terms of service. And so I think that's a challenge as well right now.

Chris Deibler (23:38):

Well, it gets back to automated data discovery and the failure of RegEx. My spicy anecdote here is using our log aggregation platform and, after an incident, discovering that something that shouldn't have been committed to logs was committed to logs. Saying, okay, we've got to go find every instance of that that has ever happened, do an exposure check, and deal with it. And so we turned on that functionality in our log aggregation vendor, and they proceeded to tell me that I have somewhere in the neighborhood of 4,000 to 5,000 AWS access keys sitting in my logs, which we then began statistically spot checking. None of which were AWS access keys, but all of which had the string AWS at the front of them, because that was the cloud provider we were talking about, and the count of digits matched, which is not a secrets detection regime.
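For illustration, a minimal sketch of why that naive match ("starts with AWS" plus a digit count) falls apart, compared with a pattern keyed to the documented shape of AWS access key IDs (20 characters beginning with AKIA or ASIA). The log lines and pattern names are made up, and real secrets detection layers entropy scoring and key verification on top of something like this.

```python
# Minimal sketch: a naive "AWS" prefix match versus a pattern shaped like real
# AWS access key IDs. The naive pattern flags the harmless line and misses the
# documented example key ID, which is exactly the false-positive/false-negative trap.
import re

NAIVE = re.compile(r"AWS\S{10,}")                        # roughly "AWS plus some characters"
TIGHTER = re.compile(r"\b(?:AKIA|ASIA)[0-9A-Z]{16}\b")   # shape of real access key IDs

log_lines = [
    "connecting to AWS1234567890 region us-east-1",      # not a key; NAIVE flags it anyway
    "request signed with AKIAIOSFODNN7EXAMPLE",           # AWS's documented example key ID
]

for line in log_lines:
    print(f"naive={bool(NAIVE.search(line))} tighter={bool(TIGHTER.search(line))} :: {line}")
```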

Matt Hillary (24:28):

Who hasn't had an incident where they found data in the logs that shouldn't have been in the logs? It's one that I think has hit many of us many times in our careers. It's painful. Either you let it age out or you sanitize.

Ty Sbano (24:39):

I don't know. I hope to get to a point of maturity where we can care about that. I've been in those shops before under CISOs, and I'm like, is this what we're spending our time on, cleaning up insert-name-of-tool? And it's just not a good use of time, because for things that are going to leak to logs, who has access to said logs? Engineers on time-bound or specific on-call-based access. So I think we have to really talk through what the threat is and what we're going to spend our time on. And I think this year of all years, those of us in this room have to justify our jobs more than ever, about why we're doing what we do and how we do it.

Chris Deibler (25:13):

What you've done by saying that, which for the record I agree with, is you have identified a Maslow's pyramid of security, privacy, and compliance care: if we have not fed everyone, we don't care what's happening at the tip of the pyramid, or at least cannot expend the cycles to care about what's happening at the tip of the pyramid.

Ty Sbano (25:31):

Well, I mean, if the business isn't making money, how are you going to get a paycheck? And I think that's a really tough conversation. But if you're in the security, privacy, compliance, or any governance and administration function that's not building the features, products, and things that people pay you money for, I think you have to be very humble and understand that since we're not the builders of that, we're the support system, we have to take some level of risk.

(25:55):

When I reflect back on the emergence of what is now the internet, and how everyone carries a more powerful computer in their pocket than existed when they were born, it is crazy to think that we are still fighting the battle in our industry where people are trying to secure it so much that no one's ever going to use it. And in reality, when we talk about engineering, most engineers ship with an MVP in mind; they are not going to be proud of what version one is. That's the MVP, but they've got to know whether people are going to use it, then they make the decision, then they make the next step. And I think that's something we, across this room, have to get more comfortable with. And, trust me, I know it's uncomfortable, but you've got to get comfortable with being uncomfortable.

Chris Deibler (26:36):

That's security, that TXT. Last thoughts, Matt, before we ask for questions?

Matt Hillary (26:40):

Just on that pyramid idea. I mean, there is a slider bar on the side of it, from gross negligence to hey, it's okay. I mean, if you find some stuff in your logs that's pretty grossly negligent, you're like, yeah, we probably should get that out of there today, or apply additional mitigating mechanisms, like restricting access, so it's protected while it ages out.

Chris Deibler (26:54):

The limit between frugality and career limiting.

Matt Hillary (26:58):

Yes.

Ty Sbano (26:59):

That's the science I think we have to do within our companies, within security, privacy, and legal: what is that balance? And to me, that's risk management. Why did I get up today and come to this conference? Why did I eat that food? Why did I cross the street? There are so many risks that we could talk about and be paralyzed by, versus, let's enable the business to try and change the world. And if you believe in what your company is doing, why wouldn't you try to help them in a creative way?

Chris Deibler (27:25):

So this wound up being a bit more of a therapy session than I had planned, which is wonderful. I'm actually feeling much better about myself. We've got two minutes, 30. Questions from the crowd before we move on?

(27:38):

I think we've cleared it. Thank you everyone.

Ty Sbano (27:42):

Thank you.



