DataGrail Summit 2023
Where legal, security and privacy connect to plan the future of data privacy
Presentation

He Said, She Said: Making Privacy Work

Jonathan Bollozos, Chief Privacy Officer, Thomson Reuters
Alyssa Harvey Dawson, Chief Legal Officer, HubSpot
Kevin Paige, CISO, Uptycs

Speaking security is a special skill, but what about legal language? Legal departments often handle all things regulatory, including privacy, but a successful privacy program requires collaboration across security, legal, and privacy. Join to learn from experienced legal and security leaders about how they approach partnering with other departments to implement effective privacy programs.

Transcript

JJ Bollozos (00:12):
Welcome to He Said, She Said, where we pit a top lawyer versus a CISO in a battle royale. But seriously, I think this will be a good conversation. It really just kind of helps double click on some of the stuff we heard today. I'll let you all get to intros in a second, but I'm your moderator, JJ Bollozos. I am an attorney, so I have to start with a disclaimer. We are all here expressing our own personal opinions, and our opinions don't necessarily reflect the views of our own respective companies. So with that out of the way, Alyssa, I'll let you introduce yourself.

Alyssa Harvey Dawson (00:51):
Hello, I am Alyssa Harvey Dawson. I'm the Chief Legal Officer of HubSpot. I am in charge of legal and compliance. We're going to get to it, but that does include privacy. And prior to that I was the Chief Legal Officer at a company called Gusto. Privacy was also on that team, under Legal. I've also been at an offshoot of, not Google, but Alphabet, called Sidewalk Labs, where I was also the head of data governance. I've just been in tech for a while. I'll put it that way.

Kevin Paige (01:29):
Hi, I'm Kevin Paige. I've been in security probably closing in on 30 years at this point. I kind of got my start in the military in the Air Force, worked for the government as a civil service employee for the Army, and then worked a lot of public and private sector jobs, Salesforce and MuleSoft and Flexport. And now I'm at a company called Uptycs. So I've been doing security for a long time in lots of different sectors, and I'm happy to be here today.

JJ Bollozos (02:02):
And Kevin, Uptycs is a cybersecurity startup, right?

Kevin Paige (02:05):
That's correct. Right. It's a cybersecurity startup that's focused on securing your endpoint to the cloud. So that's the focus.

JJ Bollozos (02:15):
And for those that don't know HubSpot, it's a CRM platform.

Alyssa Harvey Dawson (02:18):
Yes.

JJ Bollozos (02:19):
And My First Million is one of my favorite podcasts, which is in y'all's network.

Alyssa Harvey Dawson (02:24):
We just had our INBOUND last week, so it was great.

JJ Bollozos (02:27):
There you go. Perfect. Well, let's start by getting really personal. I'll start with you, Alyssa. What is your view of privacy? Is it just a legal right as a lawyer? Is it an ethical issue or is it even something higher like a fundamental right for individuals?

Alyssa Harvey Dawson (02:47):
So we're not getting into Supreme Court stuff, because let's move away from that. I think it's all of the above. I think there are obviously some legal underpinnings that are guiding it, and we certainly have enough laws and rules around the world that are sort of validating that. But I also think it's fundamentally about us. Put the legal stuff aside. It's about our ability to have control over our information, our data, the stuff about us or your companies. And so to me it's a fundamental way of working and recognizing that those who create the data and information really should be the ones who have the control over it. And what do we do when we interact with that at companies? We should be stewards of it. That's my personal opinion. We should be stewards of that and making sure that we're doing the right thing by it. That if people are coming to you for a particular purpose and it involves their data and information, you're using that data and information for that purpose. And so that's kind of how I see it.

JJ Bollozos (03:49):
Kevin. Agree?

Kevin Paige (03:53):
Yeah, I mean, I agree. I probably look at it a bit more specifically. I look at it as a data and data security problem. And the way we look at our data and the way we look at our data security, I think, is fundamental to privacy. I look at privacy as data elements. I probably get maybe a little too technical, maybe a little too much of an engineering view of it. But at the end of the day, are you doing the right thing to protect information, whether it's a social security number, whether it's PHI, whether it's whatever? There's ethical means and there's legal means and there's just the right thing to do, and making sure that people understand what that right thing to do is, because that's a pretty vague statement. And that's where we use laws and compliance to help us kind of decide what level of effort we have to put into solving it, because every level of effort costs money as well to be able to get there. But definitely ethics and very strong data security issues. Right.

JJ Bollozos (04:52):
Now we heard earlier that privacy needs to be embedded a hundred percent across every function within the company, but who really should own privacy? Kevin, maybe I'll start with you.

Kevin Paige (05:04):
Sure. I think ownership of privacy is tough because I think the definition of it is tough. And maybe that's why we've created these new roles called Chief Privacy Officers to kind of be that liaison in between. But I think that ownership is tough because it depends on where it goes.

(05:25):
But how I look at it from a security perspective is when you build a security program, all security programs are based on legal obligations, regulatory obligations, contractual obligations and risk obligations. And if you follow that framework, privacy is laws and regulations and contractual obligations that you need to follow in order to keep your data safe.

(05:46):
So from my perspective, I see it the way I've seen it at a lot of places: whoever's responsible for information security and data security owns it, while getting input from Legal to make sure that things are being done correctly. I've seen that usually work really well, because the context is usually with your security and your technical team, while making sure you're doing the right thing sits with the legal team. If you can get that good friction, I call it, you want that good friction in order to make sure that you're secure. And I think if you can get that good friction, then that leads to good outcomes and potentially good ownership.

JJ Bollozos (06:24):
So at Thomson Reuters, our privacy office sits within the legal department. Alyssa, do you think it belongs in the legal department?

Alyssa Harvey Dawson (06:32):
I think that there are different components to it. It kind of depends on how you define it. So earlier today, the panelists talked about all of the different elements where data and data oversight intersect: privacy laws, regulations, rules.

(06:54):
As we also said, though, there's a huge technical component to this, right? It does not work if you're just focusing on Ts and Cs alone. And in fact, I think you can have really bad experiences for your customers if that's all you do. We've all... Well, no one's read the privacy policies, but do you know what I mean? That is not something we want to have done. So privacy by design, to me, creates that natural intersection. You understand what the rules are, and then you intersect that with, then, what are you going to do about it? How do you govern that? How do you design around that? So you've got to have them working closely together.

(07:28):
And then as you said, security is overlaying all of it. It's a triad if you think about it from that perspective. And each one has a specific role and purpose, and I think alone they don't work; together, it's nirvana, certainly for our customers.

JJ Bollozos (07:45):
Okay. You talk about defining the rules of the road contracts, legal obligations. The one thing everything seems to have in common is they want appropriate security controls. What the hell does that mean? Kevin, maybe I'll go with you first.

Kevin Paige (08:06):
Yeah, I mean, I guess "appropriate" is a very vague term, and you can take it any way you want, but I think some of it depends on the company. For some companies, customer trust is the top value of the company. So some of this stuff has to feed into the company and the values of the company and why people do business with that company. And I think that when companies focus on trust, they're going to want to do the right things, and those right things are probably very tight things.

(08:37):
I've got a quick story I can tell. I worked at a company where trust was a top, top value. Big data breach, no auditing on where the data was stored, no authentication on where the data was stored, no possible way to know what data was breached. However, we knew all the customers that we had. So we worked with Legal and said, Hey, Legal, let's read all the contracts of every single customer and figure out what the minimum thing is we have to do. And then we're like, Hey, we're going to get a list. And then they made every security executive at the company sit on a call with Legal and talk to every single customer in that organization to explain to them what happened, how it happened, and what we're going to do to not make it happen again.

(09:21):
So I mean, that's an idea of going way beyond the minimum. And that worked out great with some customers and horrible with others, but there was not going to be a great way to do it. But that was the way the company handled it, and the company's number one value was trust, and that was how they presented themselves in dealing with that major data breach.

JJ Bollozos (09:40):
But Alyssa, if you're responsible for ensuring that the company's complying with the law and the law requires appropriate security controls, how do you make sure that what Kevin's doing actually rises to that level?

Alyssa Harvey Dawson (09:56):
I'm not a believer in just doing the bare minimum, and I know it's costly. So if you're at a small company, I get it. I get how that is a paramount thing, because you're trying to make money, you're trying to be profitable, and so on and so forth. And quite honestly, we all are.

(10:11):
If you think about it from your customer's perspective, you want them to trust you. You want to be transparent and you want to be accountable. And I think if you go from that perspective, you're going to end up at a bar that is, quite honestly, above the minimum, because to get to that level of trust, transparency, and accountability, there are more things that you will need to do than maybe what X, Y, and Z regulation or best practice might say you have to do. So I think you've always got to approach it with what are you trying to achieve and what do you care about for the person or entity that you're trying to solve for.

(10:52):
And at my companies, I've been fortunate, I feel like it's always been about making sure that the customer had high confidence and trust in what it is that we were doing. And that always raised the bar. And so when you do that, I'm sure I'm going to be able to tick off all the legal stuff because we're already operating at a higher level.

JJ Bollozos (11:11):
Fair enough. You mentioned cost. And from my personal experience, I don't own Security’s budget, so it's easy for me to say, Hey, you need this cutting edge encryption or you need these types of controls on your product, but he's the one who has to pay for it. So how do you have those conversations and how do you come to an agreement over the actual cost and the implementation?

Alyssa Harvey Dawson (11:36):
Yeah, I mean, so depending on where you are in your sort of corporate journey, there are other factors at play. There's a company budget. We're all about trying to figure out and operate within a bigger picture. And so when you break down what the things are that are needed for you to honor whatever your commitments are, you're going to analyze it with respect to, okay, what does that mean for your security team if you want to do those things and it costs X, Y, and Z? I don't want to make it seem so simple, because it's not; budgets are hard. But it really becomes a matter of: that's the line item. That's what you need to do to accomplish that goal, if it's important to you. And I think that's kind of where you can all see the rubber hitting the road. If it's not important to you, then you're going to see sometimes those budgets not getting where they need to be, but you've kind of then made a statement.

(12:30):
If you are a public company, I don't know if everyone knows the cyber rules that the SEC just passed, there's sort of another factor that's now going to come into play, and that your board cares about, and I think that's also going to raise the bar. I don't know if CISOs are like, yeah, finally someone's putting a spotlight on it and letting folks know that this matters, and it's not just going to come free. It's going to come with attention, money, teams, resources. And so I think that has also now put a spotlight on how important it is for companies to operate in this weather and at a different level.

JJ Bollozos (13:08):
And Kevin, how much can you leverage what your lawyers tell you and what the law requires to help you get that budget?

Kevin Paige (13:15):
Yeah, it's a partnership, right? Because whatever we're doing, we're like, Hey, I've got a person working 25% on this. We're falling behind. We can't respond to the data subject requests, or we can't respond to something. And then I talk with my CLO or my GC and I'm like, Hey, there's this pretty cool tool out there we can use to help automate some of this work. I don't have any budget for it. Can we go to the exec team, and can you help have my back on this when I identify the problem, I propose the solution, and then I ask for some budget? Can you be my cheerleader? And I think that's where we work together, because I want to be more secure and transparent, and Legal wants to make sure that we're meeting the rules of the law, and we've got to let the business know how to do it. And [it's] working together that gets us to a good spot.

Alyssa Harvey Dawson (14:09):
Yeah, it's definitely an allyship and a partnership. When you're doing budget season and planning, not only are you talking to Legal, you're going around the organization to get support for the needs. And if you come in there in lockstep, it's clearly going to rise to a higher level than if you were to try to do something alone without all of the other support.

JJ Bollozos (14:29):
Right. And circling back to customer trust and collaborating together, you guys talked about how important customer trust is. Well, some of our customers are lawyers who are the worst negotiators in the world, almost as bad as-

Alyssa Harvey Dawson (14:45):
You mean the best negotiators.

JJ Bollozos (14:46):
Yes. Almost as bad as my three-year-old daughter who is just stubborn as can be. But how do you educate your customers on why your controls are appropriate and acceptable as opposed to some bespoke controls that they want you to implement for their specific contract?

Alyssa Harvey Dawson (15:06):
I think that that's an interesting thing. It goes back to being as transparent and clear as you can with your customers about what it is that you're doing and why, how it works, how it is protecting their data and information, how it is enabling, if the data has to go into different places, that service that they're looking for. You have to step way beyond the legal terms and get to the core of why it matters to their business. And I think if you are talking in business terms and plain English terms, then you'll increase and enhance that understanding.

Kevin Paige (15:51):
And I'll just tie on there. Over the years, I think we've learned that every company's different. Every company's going to ask different questions. So, hey, why don't we build a data security addendum as well? Take the most common questions, get them in there, and build them as an addendum into our MSA if you're a product company. And then that way everybody knows the bare minimum that's in place, and they can ask questions based on that. And it kind of makes people feel good, first, that, okay, you're taking security seriously. You actually have a data security addendum, and that helps the conversation, or it eliminates the conversation a lot of times, because maybe you have a minimum bar and this far exceeds it, or maybe this gives you a spot to start from so you're not starting from scratch, to make the conversation easier.

Alyssa Harvey Dawson (16:36):
Make sure your salespeople know what it says, because they're your first line. If they understand it and get it, then they'll do most of the selling for you, and only those things that are unique need to sort of escalate and get there. But they're your first line.

JJ Bollozos (16:53):
Yeah, I was just talking to Matt from DataGrail earlier. He asked me what is the most important thing I think Sales needs to know, and I said they need to understand why our controls are acceptable. Because if they're just going to say, yes, we'll accept whatever controls you want, Mr. Customer, then we've already lost the battle.

(17:11):
Perfect. All right, so maybe let's talk about more of that good friction, as you said, Kevin. Alex put up a tweet about White Castle, belly bombers as they're affectionately known, and a potential $17 billion fine for collecting biometric data. I'm guessing that was a potential BIPA class action. Now that kind of goes to that friction, where if you want to enhance security, you need all that data. You want to know exactly who somebody is, what they're doing at all times, which is inherently against privacy controls. So Kevin, especially in a role that spans both security and privacy, how do you balance that?

Kevin Paige (17:56):
It's a tough balance, because I think a lot of times, from a law and regulation perspective, they're really designed, in a perfect world, to keep honest people honest. And unfortunately, I live in a world where I try to keep honest people safe from dishonest people. So a lot of times, being able to understand why certain data elements should be kept for fraud reasons, and for a lot of contextual reasons that weren't really thought out when they built the law or the regulation, takes a lot of conversation. And it's tough conversations, because when you're talking with some lawyers that maybe don't have an IP background or a privacy background, they're like, well, I don't really care, because the law says this. And you're like, well, you've got to care, because the business is equally important, and guess what, you're a lawyer, so you're going to have to defend this for us, because we can't just do this even though the law says it.

(18:45):
And getting into that spot where you can have that level of friction and figure out how you can deal with it, I think, is tough. And I personally think it's the job of the CISO to do that, to make sure that there's friction there and make sure that we're protecting both privacy and legal as well as the business. You have to balance the risk. You can't be too far to one side of the risk or too far to the other. And the reality is, sometimes laws are designed to be interpreted differently or to be challenged.

Alyssa Harvey Dawson (19:19):
And I think lawyers should know, I don't know how many lawyers we have in the room, context matters. Your business matters. It's not academic; it's not a black or white thing. So one of the core things that we can do for our companies is to understand our business deeply. We obviously are going to understand the legal aspect, but then we have to juxtapose that with what's happening on the other side, so that you can put some context around it and have those more informed discussions. If you try to come at something without that context, there's just going to be a mismatch. And you're also probably not giving the best advice or guidance to your business. And so it's a richer experience, quite honestly, to understand more about your business and what's driving it. It's a harder problem to think about solving, but it's a better thing at the end of the day if you can get in there and really truly understand the puts and takes.

JJ Bollozos (20:12):
Fair enough. Now let me throw out a scenario. I'll use the B word here. Y'all are in the middle of a breach, crap's hitting the fan, assumptions are changing every day, but the one thing you do know is there was personal data breached. It's likely that the bad actor was going after it to steal identities, but you're still in the middle of it. And so Kevin, you're doing your thing trying to fix it, but you're also cognizant of Uber, SolarWinds, things like that, personal liability for CISOs. Alyssa, you're trying to provide guidance to him. You say there's probably not enough information that's concrete enough to notify; Kevin's saying, I think we really do need to notify. And so maybe Kevin, I'll go to you first. How do you resolve that conflict in the heat of the moment, when you think there is a notification obligation but your lawyers are telling you there's not?

Kevin Paige (21:12):
Yeah, I mean, I think this is where we have to compromise. We have to figure out together what customers they are, how big these customers are. We have got to build additional context together to figure out what we should do now. What are the laws? What countries are these people in? I think there's a lot of context that needs to come together to be able to make a good determination. Plus, if the business is focused on trust and the business is focused on making sure that we're sending notifications out, whether it is or it isn't an issue, then we should.

(21:51):
But I think in this day and age, everybody's learned it's always best to lean towards notification, especially with key customers, no matter what. Right? It's better to notify, like, Hey, there's potentially a problem, than not. And I think Legal and Security have learned that over the years. Yeah, in a perfect world, you want to make sure everything is going to be perfect and you only notify on the right things. But I think we live in an imperfect world, and if we have reason to believe, then we should go that way. And I think that, from a legal perspective, some of that legal and security decision is going to come from the CEO and from the other executive team members and their thoughts on it, as well as wanting to make sure we're doing the right legal thing. But I think that's how I would approach it.

JJ Bollozos (22:41):
And sorry to jump in there, but there have been some regulators that have actually warned companies against over-notifying. And if you think about it, if you notify somebody that their identity might be stolen and then that turns out to be false, you've just caused them a whole world of pain and concern over nothing. And so I'm going to turn it to you. How do you kind of navigate that and manage that?

Alyssa Harvey Dawson (23:03):
Yeah, first of all, ideally you've spent some time before the incident coming up with and developing your incident response plan: the criteria, the framework, your principles, the key questions that you want to have answered, the key things that matter to you, how it fits in with your company's principles, so that you're not doing this live for the first time when there's an incident.

(23:33):
Now, it's not always a perfect world, but I'm telling you, if you don't have that plan, you might want to go back to your companies now and put something down on paper. And that's not done best by Legal alone; that's done [by] Legal, Security, the Product team, and other management, and your board as well in certain circumstances, depending on where you sit. So you have a framework that you're starting off with; you have some principles to guide you through that.

(24:06):
Then of course, it's going to get thorny, because every incident is not treated the same, but if you start from something other than zero, you at least have some dialogue to work from. And I do think you need to be careful not to over-disclose. You need to be careful. We talk about this with the new cyber rules, where everyone's like, oh, materiality. Well, maybe everyone just files an 8-K the minute you learn that there's a breach, but there's a downside to that, a lot of which you mentioned, and that's not the standard. And so I think we need to be careful not to sort of go overboard, because you also might lose some principles there too. It becomes less... People don't take it as seriously if you're just like, yep, yep, breach, yep. Jeff's telling you, you start to lose the audience. And if you work on what your materiality standards are, hopefully in most cases you're going to get there and you're going to be able to sort of negotiate the edges. That's what I would say.

JJ Bollozos (25:05):
And so within that framework, who ultimately owns that decision?

Alyssa Harvey Dawson (25:11):
Yeah, if you're public right now, your board. If you're a private company, your senior leaders with input. I don't know if you guys have heard of the DACI model, where you have a driver, an approver, those who are consulted, and those who are informed. You can decide for your company. But I would say, depending on the size of the breach, it might be your CEO, but I think you designate it. Again, don't leave it up to chance; if it happens, have people know who gets to decide and why. Maybe you have dollar levels or thresholds, but have it out there so that there's a clear owner and approver.

JJ Bollozos (25:49):
Okay, so we only have about four minutes left. I want to bring it back to something a little bit more collaborative. You all might know that Drizly recently settled a case over cybersecurity issues in which the CEO was actually held personally responsible, so that he not only has to build a cybersecurity program at Drizly, but that obligation will follow him wherever he goes after Drizly as well. And so how do you collaborate together between Legal and Security to make sure that your executives are not only aware of the risk, but understand the risk and are aligned and on board with you both about how to mitigate that risk? So Kevin, I'll go to you first.

Kevin Paige (26:35):
I mean, I think it starts with regular updates to the executive team and to the CEO and to the board, letting them know where things are at, how things are going. It really starts with communication. We can't let surprises happen. So I think the biggest thing is how do you minimize surprises and how can you really focus on making sure that the big issues are known? The unknowns are identified as being unknown, and how can you just make sure that you've got continuous communication? Because that's going to be the biggest aspect, because if things come out of left field and things are going to be surprises, then afterwards people are going to ask, Hey, what have you been doing? So minimize surprises and keep communication flowing.

Alyssa Harvey Dawson (27:21):
Yeah, communication, communication is so key. Hopefully in your structure you can have time for a sort of regular update, a state of security, a state of cyber update, that happens maybe quarterly or every half year. You can decide. Maybe you also start off by giving people just sort of a baseline assessment of where you are. You do your initial security assessment so people know what's positive and know where there's work to be done. And then you have, as Kevin was saying, that regular update, that cadence, so people are informed. People are aware, you know how you're tracking, you know how you're tracking against your peers. But it's just always key to communicate as much as you can.

JJ Bollozos (28:04):
Yeah, we see that it's coming in at the board level too, so it's almost top down, not bottom up, where the board is hyper concerned about it. So of course, that's going to make everybody downstream concerned about it too.

Alyssa Harvey Dawson (28:16):
Yep.

JJ Bollozos (28:17):
Perfect. Well, I just want to thank you, Kevin and Alyssa. We really appreciate it, and thank you all. I think we're holding you from your break, so we'll let you get on with that.

