Let’s Get Technical: Talking Privacy With Your CISO
By 2024, 75% of the world's population will have privacy rights, but many organizations are still in the early stages of building comprehensive data privacy programs. Regardless of where privacy resides in an organization, it's a technical challenge that needs security team support. Overstock's Brandon Greenwood and FanDuel's JJ Agha will unpack data privacy's technical challenges and discuss how to speak with security teams about building lasting privacy programs that reduce overall business risk.
JJ Agha (00:09):
This session came out of a conversation I had with a bunch of general counsels. They were asking me and the panelists, "Well, what do we need to know as GCs and data privacy officers? How do we actually connect with our CISO or security program? Do I need to be very technical? How does my CISO understand the compliance and regulations that I need to support?" So, this is our conversation about how we bridge that gap in our companies and what we've done in the past. There'll be laughs, there'll be tribulations, and we hope to share some anecdotes. Then we'll leave some time for Q&A at the end. All right, so how many of you are the DPO or have a DPO?
Chatting with a few folks, it's all different kinds of hats. Everyone just points at one another: "Are you the DPO today?" It's the same point Alex brought up: security or the privacy counsel is just going to be stuck dealing with the things that fall into no man's land. So, this is our current state of affairs. We're still trying to figure out who's fully accountable. We all share some aspects of responsibility, but this is really where we're at: who truly is the DPO, and who has full accountability or ownership to drive privacy through the company?
Brandon Greenwood (01:33):
I'll just give a quick story here, very brief, from when we were first discussing this slide. I don't know if you've talked with folks who either have a DPO at their organization, or have discussed this with the Spider-Man meme, everyone pointing at each other. A friend of mine back East... It's not JJ. A friend of mine from back East, we were talking about this after JJ and I went over it: "All right, well, what were you looking for?" Because if you go and look at the DPO role as it's defined in GDPR, it can go either way. I could make a case for myself, I could make a case for legal, I could make a case for somebody else. I genuinely heard from this person, "You know how we decided it? Rock, paper, scissors." I was like, "I do not want to work there."
JJ Agha (02:20):
I think at the end of the day, whoever pulls the short straw becomes the DPO. I'll kick this off. Over the last four years we saw the explosion: we care about privacy, GDPR was the kickoff, and we went from 10% to, I think, the 75% on the screen. We're still in the early stages of this. I use the analogy that we're flying the plane while rebuilding the engines. So, how do we build out that paved road to the future state, so we know that the rights and the policies that we're writing, the compliance requirements, actually end up becoming an actionable control, an actionable monitoring state that's actually useful for us? Not just a wish list: "We care about data privacy. We care about controlling PII." This is the current state I think we're in as organizations.
Brandon Greenwood (03:12):
What's interesting about this is, how many of us wish that our security, privacy, and governance compliance programs ramped up like this? They don't. The wheels of legal and regulation are slow. And sometimes, by and large, some of the things we need to do from a technical perspective might not be in a long-range plan. So we've got to figure out how to do them. The scale sometimes isn't as great as what we're looking at here, but these are all considerations and factors we need to think about when putting together a privacy program and working with our peers in the organization.
JJ Agha (03:51):
So, you're saying I cannot throw money at the problem.
Brandon Greenwood (03:53):
You can, I just don't know how efficient it'll be.
JJ Agha (03:56):
What about AI?
Brandon Greenwood (03:58):
AI will work.
JJ Agha (03:58):
All right. This is our current state. I use the analogy that we're squeezing Jell-O. This is part of that data sprawl. We have assets that are all over the place. How do we really solve for it and take the first steps? I think you deal with a little bit of analysis paralysis. When you look at the problem, you look at the landscape, you go, "Well, my data's everywhere. Tech really doesn't want to support me. Security is dealing with the next incident, the next breach. How can they actually help and support me and foster the conversations?" So, it does feel like we're squeezing Jell-O, because no one really owns the problem. It's a problem shared across multiple people. This really is, I think, the current state. It's okay. It's okay to admit that it's a little bit messy. It's not ideal. We're still trying to figure it out, but I would say we're still trying to wrap our hands around the problem we're in.
Brandon Greenwood (04:47):
What I really like about this slide is you're using your imagination to see what comes next. Typically, in a data privacy program, you're looking at a few different stages. You're looking at establishing the program, the establishment stage. When I first started putting mine together, that's what it looked like. I thought, "I've got this. It's in my hand. I'm ready to go." Then, while doing the discovery, classification, and mapping, the Jell-O started going everywhere. Then the maintenance phase of the program, and then the evolution stage. The Jell-O goes everywhere. Nobody's going to be able to get their hands around it. I could have used two hands; Jell-O would still be going everywhere. It's normal. It's what we have to deal with, and a very good illustration of what that looks like.
JJ Agha (05:28):
I think the conversation also becomes: what is actually a priority? You have to work with your security team and your legal team to get context, to figure out what actually is the priority. What should I tackle? What's the asset class? What's the actual data type? What do we actually care about as a business? Before it's, "Oh, I found another thing that's not falling in the bucket. Let me just scoop it over." We don't want to throw things over the fence, but this really is the current state. We know we're still trying to corral everything. We're trying to get to a better state, but it is okay. We feel you. I think we're one of the more self-deprecating groups of CISOs. We're still trying to figure it out as each day comes by.
When we transition to the next slide: our assets now, on the technical side, are no longer owned by us. Our assets aren't the endpoints in your data center. They're no longer on a physical laptop. They're in your SaaS providers. It's exploded everywhere.
Brandon Greenwood (06:25):
I'll take this one. The cloud paradox, essentially I think the slide speaks for itself, very much like the Jell-O. We've got data, to JJ's point, all over the place. Where is the PII? Where is the other maybe IP that we need to be worried about? When we're looking at and discussing this, I'll give a quick anecdote with one of my experiences at Overstock, and this was years ago where-
JJ Agha (06:49):
Not now, definitely not.
Brandon Greenwood (06:50):
No, not now. No, it's Bed Bath & Beyond now. But we were looking at this problem saying, "All right, we need to rein this in, or at least identify it." A lot of it had to do with things like shadow IT, and shadow IT is a procurement problem in many ways and a technical problem in many ways. An example: we had, I think, Trello, Asana, and some Microsoft product all doing sprint tracking, and everything else. There are some challenges with this. There are obviously the security challenges that come along with it, and I don't just mean with the data itself, but the actual access to the platforms. Are entitlements correct? Are we doing the provisioning and de-provisioning correctly? Do people share things in Dropbox open to everybody? If so, how do we federate the management of that? So, really thinking about and considering those things.
Compliance and legal issues obviously factor into that: considering what we are being asked to do by the legal team, and sometimes vice versa. We've got to consider those specific things and how technology can help us get our hands around... I don't mean that based on the Jell-O slide, but how do we wrap our heads around this and make sure we are effectively and efficiently managing that data and that access, to the point where we can demonstrate to, say, an auditor or anybody else who comes in a level of due diligence that might help us if something goes wrong? We obviously don't want anything to go wrong. We want to put in the right controls, be they detective or preventative, to make sure something doesn't happen. But if it does, how do we communicate, whether to the legal community or from a PR perspective, "Hey, we really did do what we were trying to do, but these things happen"?
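The entitlement hygiene described here, checking whether provisioning and de-provisioning actually happened across scattered SaaS tools, boils down to reconciling each app's user list against the current roster. A minimal sketch of that kind of check; the apps, accounts, and data are all made up for illustration:

```python
# Illustrative sketch: reconcile SaaS entitlements against an HR roster
# to catch de-provisioning gaps. App names and accounts are invented.

active_employees = {"ana@example.com", "raj@example.com"}

saas_entitlements = {
    "Trello": {"ana@example.com", "departed@example.com"},
    "Asana": {"raj@example.com"},
    "Dropbox": {"ana@example.com", "contractor@example.com"},
}

def deprovisioning_gaps(entitlements, roster):
    """Return accounts that still hold access but are not on the active roster."""
    gaps = {}
    for app, users in entitlements.items():
        stale = users - roster  # set difference: access with no matching employee
        if stale:
            gaps[app] = sorted(stale)
    return gaps

print(deprovisioning_gaps(saas_entitlements, active_employees))
```

In practice the entitlement lists would come from each provider's admin API or an identity-provider export; the reconciliation logic stays the same.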
JJ Agha (08:48):
To add on to that, when you look at the cloud paradox, it becomes, again, what do I prioritize? The themes go back to what actually is there and how am I monitoring it. They are the same problems. It's the risk and reward conversation, because when you look at it, and I will use the words generative AI, or you look at OpenAI, I bucketize the first phase of solving for gen AI into this area. Is this a data sprawl problem? Is this a data access problem? Where's my data going? It falls squarely into this maybe-not-perfectly-shaped hole, but this is where I'm tackling generative AI as a problem: where's my data leaving, where's it going, and how is it being processed by a third party?
Brandon Greenwood (09:31):
Data lineage around it.
JJ Agha (09:33):
So, this really does become another conversation: third-party risk. Who has your data? What are they actually doing with it? Who are the subprocessors? Who's behind that? All the conversations that we're having come down to the same problems, the same challenges you need to solve whether you're the CISO or the general counsel. It is the same paradigm, the same parallel problem to be solved, but you're just asking about it at different points. Your legal team is asking one set of questions and your security team is asking the same set of questions, to surface the same set of risks and problems that we want to address.
So, I'll add this: it really is a technology problem. You can't just throw a bunch of Ts and Cs at it. No matter how many data privacy addendums and data security riders you throw on top of your contract, that's not going to protect you. Your CISO's not going to be happy about it. The general counsel may be happy because we can ensure there are appropriate indemnifications. But guess what happens with those indemnifications? Something happens, cyber insurance gets pulled down, everyone gets paid out, and you get $0 out of it.
So, what did you actually do to reduce risk? You had a piece of paper that said, "I reduced risk." Really, when I look at the data privacy conversation, it is an access problem, and candidly, we're still struggling as a security function. We are still struggling with the access management problem. What are our assets? Who has access to what assets? When I look at data, I still look at it as an asset, because it is an asset, and it lives beyond the four walls we operate in. So, that's the mental model I always approach with when I'm looking at data privacy, data security, or eventually the generative AI conversations.
Brandon Greenwood (11:17):
I really like that, and I like what you said there, because when I first saw this slide, first discussing it, I just thought of this while you were bringing it up: the technology problem being the enablement. When I first saw this, I was thinking, "All right, technology, we can control it through security controls, security mechanisms, policy," but I think this slide is more than that.
If we didn't have the technology, would the problem be as big as it is today? That being said, we have to do something about it, with things that might be within our control, either natively or that we have to go out and spend a little bit of money on. We do need to address the issues here from an ethics perspective: what does this mean? The societal dimension is another really key point here, with how we manage this data from a technical perspective and use technology not only for that detective capability, identifying where the data is, but also to be able to prevent, or do just-in-time training, those types of things.
I don't think it's a matter of yes/no, binary, one/zero. As CISOs, we have to enable the business, and through that we provide guardrails and hopefully make it easy and engaging for people to do the right thing. That's what we were looking to do here: use technology to help people get their jobs done, be efficient, get it done right with the data they need, and be able to manage not only the access, but where and how the data is used.
JJ Agha (12:45):
The idea is that we want to give context to the decision. We're not going to be the gate that says you can do it or not. How can we provide the right governance and the right controls on top of it? But I think Alex mentioned it earlier in the keynote: if someone is telling you that this technology is going to solve everything, they're lying to you. It's just the way it is, because either your organization isn't going to be able to appropriately support it, or you're not going to have the right staff or processes. So, there really is no silver bullet here, no just slapping the technology on and saying this is solved. This is an ongoing, early-innings conversation about how technology really enables us to secure this, because technology is what created the challenges and problems we're in right now.
So, it really brings up a thought-provoking conversation. This was raised by Sounil Yu, the CISO at JupiterOne, on a recent podcast for security folks: are CISOs now the CFOs of IP? It took me a second. I was like, "Well, let me think about that. What does the CFO actually do?" I think Enron was really that watershed moment: CFOs became very powerful in the conversations around providing the right governance and the right ways to use money, but they were never going to be the ones saying, "I'm going to approve everything."
There are tiers to those conversations. So, it's similar, I think, when you think about data privacy and the data access that happens: I'm going to give you the appropriate policies, I'm going to give you the right tools to make better decisions about how to use that money. At the end of the day, the risk and reward calculus should favor using these tools to access your data, to unlock personalization, to unlock all the benefits of having data, without putting privacy at risk in doing so.
Brandon Greenwood (14:37):
And I think not only the CFOs of IP, as Sounil mentioned, but also of PII, which might be categorized or classified as IP in your organization. We were at dinner last evening talking about, of all things, shocker, generative AI, and AI in general, and we went over an analogy. All right, everybody knows that we need the data. There's no question about it, whether it be through AI or whatever it is; we just need to make sure it's handled responsibly. How do we do that? We made an analogy going back to cars. We know people want to get places faster in a vehicle than they do in a horse and buggy or by walking. All right, so how do we put together, as I mentioned earlier, the guardrails or controls that are going to make it safe for everybody?
All right, well, let's use your seatbelt. Use your seatbelt, and we're going to give you access to this vehicle that you can cause a lot of damage with, to yourself or someone else, and try to be as safe as possible. I'm wearing my seatbelt not only because of how terrible... I'm from Utah, so I do not drive well, but I'm also worried about the rest of the Utahns who don't drive... Right, Matt? Who don't drive well. So, I'm going to wear that seatbelt, and you know what? I'm going to be looking for a car... I'm proactively looking...
All right, do I have the airbags, or are they... What is it, the Tesla, where if I'm sitting just right, it changes how the airbags deploy for better protection? That's what we need to do here. We need to enable the business with the seatbelts, giving them the data, which is the car, and saying, "All right, here are the things you can use to make this journey a little bit safer." And when I go back to proactively looking at the Tesla airbags, when we look at this from an IP or PII perspective, a really good example from a privacy perspective is Apple. I think Apple does this very well, using it as a marketing advantage. Maybe some other folks don't.
JJ Agha (16:40):
We're going to use a marketing ploy?
JJ Agha (16:45):
But if you take the seatbelt analogy, we even talked about it: folks were complaining, "I need to put a restraint on myself before I drive this car 40 miles an hour?" That was what, 40, 50 years ago, when seat belts weren't mandated. Going back to those early stages, we're going to start seeing better roads being built out, better guardrails to keep your car from careening off the road, stoplights.
These are all built to make sure that the risk and reward of going super fast in a car, of going 80, depending on speed limits, is managed in a secure way, so that you are able to reap the rewards without putting yourself too much at risk. That's the same common theme when you look across security and the general counsels and legal: we are all here to reduce risk and educate the business on risk, but we come at it in different ways. Here's the technology aspect, and here's the compliance aspect: let's make sure we hit this compliance mark. That common thread that binds us together, that commonality, is really what drives us to work together each and every day.
Brandon Greenwood (17:52):
And if we look at privacy, this slide, or at least the data for this first portion of the slide, was from Forrester, basically asking: which of the teams or business units in your organization are part of your privacy program? Maybe it's a task force or some other group that's been put together, and you'll see these oftentimes as a cybersecurity council and a privacy council and everything else. This was the breakdown. I believe it was done sometime last year; I'd have to go back and double-check. Anyway, you can see that IT and technology and security are the top two. I would argue with where legal is at on this deck, or on this slide, excuse me-
JJ Agha (18:37):
I think that just shows legal's smarter than security.
Brandon Greenwood (18:40):
And maybe that's it. Maybe they were the Spider-Man pointing at the guy who actually ended up being the DPO. But I think some of these should be higher; legal is just the one that stands out. If we move to the next portion of this slide, this is where we should all be targeting, and the best way to get there, I think, is by using our influence. It doesn't matter if you're in security, privacy, legal, marketing, anybody in this room: getting to something like this takes work, it takes effort, and it's going to take partners, whether for privacy or anything else. But this is very important, and it should involve everybody in most organizations. So, getting better representation across all groups, having a diverse group that can talk through these legal issues and the ramifications of implementing a policy or procedure at your organization, is going to help you a lot.
It's a lot like design-driven development, where you go out and talk with your customers and say, "All right, what do you want this to look like? What do you want this new feature to do for you?" Rather than JJ and I sitting in a room saying, "You know what? We think they'll like this," putting it out there, and nobody buys it. We need to get folks to buy this, and it really starts with the people in this room being able to influence, regardless of title.
JJ Agha (20:04):
For folks who work for technology companies, which I think is everyone here: we are always enabled by technology. You'll have your product or engineering counterparts write a tech spec, a product requirements document, and they'll always go through the conversation of, "Well, what are the actual dependencies?" That's the opportunity for legal and security to also say, "There's a compliance dependency, there's a security dependency on it." This really started off with... I loved hearing... I think Jess mentioned privacy by design documents, PDDs. When GDPR came out, when I was at WeWork, we created and leveraged PDDs as part of the conversation. We were doing a lot of low-level IoT, we were collecting a lot of biometrics to support wayfinding, and this was a conversation around BIPA for Chicago, but also around how we actually protect... And again, there's reciprocity to the data that we collect, but also to the use cases of what we're actually using it for.
So, that was really one of our big principles when we thought about the privacy by design documents. If you are not part of it now, go back to your engineering team, go back to your product counterparts, and figure out a way to include legal or any compliance requirements as part of the kickoff phase. Getting that embedded now is going to be a lot cheaper than doing it later in the cycle. So, if you don't have privacy by design, but it's something you need because you're collecting a lot of information, definitely lean into it with your product and engineering counterparts and reuse what they have. Don't create a gigantic new council, because you're going to get a lot of eye-rolling when that happens.
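The "compliance dependency in the tech spec" idea described above can be made concrete as a kickoff gate: if a spec says it collects personal data, a completed privacy-by-design review becomes a blocker, just like any engineering dependency. A minimal sketch, with made-up field names rather than any real template:

```python
# Illustrative sketch of a spec-kickoff gate. The spec fields
# ("collects_personal_data", "pdd_completed", etc.) are invented for
# illustration, not from any real PRD template.

def kickoff_blockers(spec):
    """Return the compliance dependencies that must clear before build starts."""
    blockers = []
    if spec.get("collects_personal_data") and not spec.get("pdd_completed"):
        blockers.append("privacy-by-design review (PDD) required before build")
    if spec.get("new_vendor") and not spec.get("security_review_completed"):
        blockers.append("security/vendor review required before build")
    return blockers

spec = {"collects_personal_data": True, "pdd_completed": False, "new_vendor": False}
print(kickoff_blockers(spec))
```

The point of modeling it this way is that privacy review shows up in the same dependency list engineers already track, rather than as a separate council.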
So, you've heard me talk about us still being in the early innings, when you look at it both from a security perspective and from legal. In the old days, cybersecurity counsel was outsourced. It was never in-house. It wasn't something you wanted to invest in. It was, "Ah, we'll figure it out." Something similar is happening with the CISO persona, if you will. It's no longer the IT professional who became a security CISO. They're now security engineers, or people with backgrounds in legal. You're seeing that maturity, more depth to the individual, versus just a pure IT person who makes sure my things are working. It's becoming a lot more mature. So in our present day, we're now seeing a lot of in-house data privacy counsel and cybersecurity counsel; that's all moving in-house each and every day.
Brandon Greenwood (23:23):
Do we have one more?
Brandon Greenwood (23:26):
Where are we headed? I think that was a very good intro to this. Board-level reporting on risk is already happening in many organizations. My organization now has a CISO on the board; the CISO for HP is on our board. So boards are understanding this a lot more, and this individual, the CISO at HP, also has privacy and governance reporting up through them. So, your organizations, if they haven't already, should start having those discussions, if not go out and hire somebody with a technical or security background to be on the board, be it advisory or otherwise. This is definitely something to start getting on leadership's, your CEO's, radar. Hey, I think this makes sense, and it shouldn't be viewed as... We've got folks in here from legal. Almost everybody in here is going to have somebody on your board or advisory board who's in legal.
This is not a knock; I think it's very good. The draft rule that the SEC just put out talks specifically about this, making sure there's cybersecurity experience on your board. Now, that didn't make it into the final rule, but it got the ball rolling. It got folks talking about this, and I don't think that's going to lose momentum. While it might have lost a little bit of the "hey, you have to do this," I think ultimately we're going to find that organizations, while not being told or asked to do this, are going to do it, because it does mean a lot, it does resonate.
And these are the types of things where you might also start seeing that security leadership position elevated even a little higher than it is. I think most folks in this room probably already have the "seat at the table," speaking the language of the business. Well, the language of the business is turning to data, whether it's data you hold yourself or data in the cloud. That is becoming the language of the business: data, IP, PII. It's not Ethereum; the new digital currency is data.
JJ Agha (25:24):
And on top of that, I think if you look at Equifax, Equifax was kind of that watershed moment for us as CISOs, where we're now becoming an analog to the CFO. We actually are being looked at as, "You are an executive, you are an officer." So, when you look at the future state, I think there's a convergence. Risk used to be passed through the audit committee, passed through the general counsel, whoever owned enterprise risk management. You're going to start seeing the general counsel just say, "Well, here's the CISO." I've done it before at previous companies and presented directly to the board, but now I think you're going to have more of a seat at the table, presenting and giving feedback on all things, not just "come in here for an hour, give us a security update, and move along." You're going to be part of the actual board meeting, to really determine the risk the company cares about.
Brandon Greenwood (26:20):
Not just audit committee, 100%.
JJ Agha (26:24):
So, we lean on legal a lot. We cannot do what we do each and every day, putting out fires, dealing with fires, getting things thrown at us, if it wasn't for legal. There's a ton of things where, if I'm dealing with a security incident, to take the first example, I have to lean on legal. But there are some conflicts. It's not all roses and butterflies when you're working with legal. There's sometimes conflict when you're dealing with an incident, vendor management, or regulation updates. For incidents, you really want to be in lockstep with your general counsel. When I was at WeWork, there was a vulnerability that was deemed, or looked at, to be an incident. There was really a conversation of, well, this actually isn't an incident. An attacker didn't take advantage of the vulnerability or the weakness or the risk. But she was very adamant that this was an incident.
I was like, "Well, you're just going to waste a lot of my team's cycles. Why do we need to go do this?" "No, it's an incident. I'm going to classify it as an incident." I'm like, "Well, based off our policy and based off our description of a vulnerability, it's not, so how are we classifying it?" "No, it's an incident." So, those are the things that sometimes just don't go well when you're leaning on legal for an incident. But the flip side is very true. On an incident, you get attorney-client privilege, you get ACP; they're your best friend. You could ask for [inaudible 00:27:51], you could ask for a lot of feedback. "Hey, this is our thought process. This might've happened with the data leak. Give us feedback." And then there's the point that each state has its own data breach requirements.
That is a nightmare to deal with. I've dealt with it multiple times. There's actually someone here who was part of the counsel we worked with previously, and any time that happens, it is an absolute problem. But we actually farm it out to the legal team and say, "All right, you go figure it out. Here's the data set. Go work with outside counsel and let us know what we actually need to do to support." So, there is that reciprocity again, to support what is, at the end of the day, reducing risk for the business.
Brandon Greenwood (28:34):
And I think incident response and regulation updates here speak for themselves. Vendor management is the one where you're like, "What? Is that really..." Yes, it does belong up here. The reason is we're not only worried about the vendors who might have access to our data, whether it's in our tenant, their tenant, whatever it is. I believe it was Rick who was talking about DSRs in that panel this morning, about when we're servicing DSRs. When we're talking about vendor management, it's not just third-party risk management. Who loves filling out or submitting those third-party risk management questionnaires or surveys? Yeah, they suck. And real quick, was it Jessica and Trishla? They dropped some... I think it was "crap" and "hell." We've got to pick it up, man.
No, but seriously, with vendor management and the difficulty or challenge it sometimes is to not only submit those or get them done the first time, but then to resubmit that questionnaire repeatedly, or on an annual basis, whatever it is: a lot of automation is happening in this space. When we first started doing this, back in 1918, whenever it was, it was very manual, not only on our side but also on the part of our vendors or partners whom we might have to ask to delete some data, or for discovery, whatever it is. We automated much of that with some tools that might be here today, as part of that process on our side, to make it a lot easier, at least from a DSR perspective. It used to be that when we got a submission in, we would have to email all of the vendors who could potentially have that data.
Well, we've got that dialed in now. We know exactly who should have that data, and they have automation on their side where they've said, "We trust you've done your due diligence, and if you send us a request to take action on, we know you've got it automated, so we're more comfortable with that as well," and we just ship it off. There are still some organizations who are still going up that maturity curve we looked at a little bit earlier, but this is getting a lot better. It's still kind of crummy to do sometimes, but it's getting better, and the more we can automate in this space and others, I think it's just fantastic.
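The DSR automation described here amounts to consulting a data map so a request fans out only to the vendors that actually hold the relevant data, instead of emailing everyone. A rough sketch of that routing step, with an invented data map and vendor names:

```python
# Illustrative sketch: route a data subject request (DSR) using a data map.
# The vendors and the fields they hold are made up for illustration.

DATA_MAP = {
    "email_marketing_vendor": {"email", "name"},
    "analytics_vendor": {"device_id"},
    "payments_vendor": {"email", "payment_token"},
}

def vendors_for_dsr(requested_fields):
    """Return only the vendors whose recorded data holdings overlap the request."""
    return [vendor for vendor, held in DATA_MAP.items() if held & requested_fields]

# A deletion request touching email data goes to two vendors, not all three.
print(vendors_for_dsr({"email"}))
```

Keeping the data map current is the hard operational part; once it exists, the fan-out itself is a simple lookup that both sides can trust enough to automate.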
JJ Agha (31:01):
It's two CISOs here saying we need your legal help and general counsel to make us better. There's no way we could do this without you, and I hope you feel the same way: you need us to help actually implement the technical controls. But going back to it, we all care about reducing risk. We all, I think, come at it with a different set of tools. You come at it with your compliance regulations and your law degree, something I do not have and do not want. I'll come at it from an engineering background. But at the end state, all the way far right, all we care about is reducing risk. And the common way to do that, and actually enable it quickly, is with technology. That's the binding factor that really does reduce risk for us as a whole.
But when you're looking at oversight, sometimes you will not agree with your legal counterpart on their reading and their viewpoint on how to actually meet a compliance requirement. You might want to solve it in a different way. Those are common issues; you're going to have that tug and pull of saying, "Well, based on the enforcement of that law, this is what we might need to do." Meanwhile, the business is telling me no, they want to push all that data down to this one vendor, the reward is too high to pass up, so we can't go down that path. It really does put us in weird positions sometimes, where engineering is asking us to do one thing, security is saying we could actually implement this securely, but legal is saying, "Absolutely no way. We cannot do it, because the SEC is requiring us to do monitoring."
And this came up with our monitoring controls for off-channel communications. I really don't care about that; I care about MDM, I care about who has access to my data. But my legal team is using that piece of technology to prevent you from using Telegram. Now I'm in a bind. I can't actually implement MDM, because it turned into a conversation about monitoring what everyone says and preventing off-channel communications. Those are the common challenges that will pop up: the technology will be used in different ways. It's a tool. There's a great tweet where someone said a spoon is a great utility: I could eat cereal with it, but I could also scoop your eye out with it. It's true, and it's the same thing with generative AI or any of these utilities we have. With great power comes great responsibility, and that means using it appropriately and for responsible reasons.
Brandon Greenwood (33:36):
And I would say I like this slide a lot. I like the magnets: we each have this pull, this thing that we really like to do, but at some point we need to come together to get to that end state. Another interesting-
JJ Agha (33:47):
Would you say opposites attract?
Brandon Greenwood (33:49):
I was just going to go there. Flip those magnets around, and then they're pushing against each other. Maybe that's what we want, if we need to disagree more to get to that end state. I'm kidding, but it's interesting to think about. Where we want to go here, though, is the lexicon issues, and that's probably where we've got the opposites attracting and the magnets pushing against each other. I did not stage this, I purposely did not say anything to JJ about it, but JJ is a big sports fan.
JJ Agha (34:18):
Working at FanDuel I have to be.
Brandon Greenwood (34:21):
You've got to be, right? So, if I were to use the word "block" from a sports perspective, what does that mean to you?
JJ Agha (34:29):
Depends on what sport.
Brandon Greenwood (34:31):
So, I don't usually hear that from legal, but that's a dang good answer. I'm going to throw one out there at least: if I say, "Block the shot," what does that mean?
JJ Agha (34:41):
In basketball, an attempt to score was blocked.
Brandon Greenwood (34:46):
Very good. And in football, blocking someone is... I want to illustrate; I know everybody's not into sports.
JJ Agha (34:53):
Do we need-
Brandon Greenwood (34:54):
No, I'm not going to do it, not with you. But the word "block" in football is a lot different. It's physically getting in there, beating somebody up, and keeping them from making a tackle or bringing the ball carrier down. So we're using the same words, and this is a challenge for everyone, I'm sure. It might not even be between security and legal; it could be between your operations team and your tech operations team. They're both talking about operations, but they mean two different things. The point is to illustrate that, going back to the last slide, we're all on the same path to an ultimate end state that we both want, but when we're speaking the same words in a different language, based on the business unit we're in, sometimes it takes longer to get there. Hopefully, we ultimately get to that spot.
But how do we really do it in a more efficient way? That's the best way to put it, I guess. I've got a really good book. I have all my security leaders, and really all my leaders, IT as well, read this book called Never Split the Difference. There's discussion in there around negotiation, where they talk specifically about making sure you understand what the other person is saying to you.
So, it shouldn't be the other person's responsibility. In Brandon's opinion, I shouldn't expect my general counsel to know exactly what I'm talking about. I should illustrate, the best way I can, what I mean if there could be some confusion. And if I hear him say something that might be construed differently from what my team would mean, I try to reemphasize it and say, "Okay, here's what I hear." I really try to make sure the lexicon, the vocabulary, the terminology we're using is consistent between us, because we need to get to that end state as quickly as possible. There's nothing worse than having a disagreement without knowing you're having a disagreement. That creates a lot of problems.
JJ Agha (36:57):
So, I would pick on some of these words as well. Make sure you walk away from talking to your security program or your general counsel understanding exactly what these words mean and what they trigger. We won't use the breach word, but when you use it, it evokes a very specific set of actions that we all want to prevent. So at the end of the day, when you look at privacy risk, it really is a common factor that binds us together. It's what security cares about, it's what legal cares about, but honestly, it's what the entire business cares about. Even when you think about the conversation around personalization, I was laughing during the previous talk, you will get 10 different outcomes when you talk about personalization.
Brandon Greenwood (37:43):
Really good example.
JJ Agha (37:44):
You'll get 10 different ways to solve personalization depending on what company you're at, but it really is a common factor between legal and security: we care about reducing risk. And when you look at privacy, privacy is a risk that we want to reduce. So this is a common factor for all of us.
Brandon Greenwood (38:03):
And definitely understand that it's a technology challenge, problem, issue, whatever you want to call it. We can't do it with T's and C's, and we can't do it with policies, procedures, and other controls, not individually or by themselves. We've got to have something with teeth, if you will. You've really got to have something that can enforce, notify, and inform: not only taking this data in and saying, "All right, we've got a discrepancy here, or this is a place where we need to take action," but empowering end users or associates in the organization to know what they can and can't do, and making that very clear.
It shouldn't be that you only hear about it because security sends you an email saying, "Hey, don't do this," or "We noticed this." The more we can do to automate and raise awareness... educate is probably the wrong word... raise awareness of what's acceptable and what's not, especially from a privacy perspective, the better off the organization will be. It shouldn't be, all right, once a year I've got to watch this 10-minute video and then I'll know everything I need to. No. The more we can do it in real time, the better.
JJ Agha (39:16):
It's a team sport, so make sure you're all on the same playbook. It's not the most fun thing to do, but going through crisis management, going through a breach procedure in tabletops, doing that with both security and legal... when you're actually in it, it's one of the most fun activities, and you build a better relationship. Regardless of whether you find all the problems along the way, which you hopefully do, the end state will be a better relationship between you and the security team, or you and your legal team, on exactly what to do when a crisis happens. So tease those out, make sure you're on the same playbook, make sure you're all on the same call cadence. That's really helpful for building the atomic habits, if you will. Folks will just know what to do, and after that it becomes repetition for us.
Brandon Greenwood (40:12):
Isn't this weird? A team sport, a playbook, the first innings, what's going on here? No, we are early in this process. I think everybody knows that. There's a lot of work that needs to be done, not just on the technology and security side, but in the coordination with legal and with other folks in the business who are going to help us get to that end state more quickly, more efficiently, and with less pain. So look for opportunities to partner with those folks.
I mentioned influence a little bit earlier. That's a big one. If folks really know how to influence and be... Maybe say if you're not, I mentioned earlier, I imagine we've got some folks sitting at the table right now. If you're not, even without that title, you can still be a leader. You can still influence without a title and ultimately you're going to get there if that's where you want to be maybe, but it's key to be able to establish when we're talking about the first innings and that teamwork, nothing could be more true. At least when we're dealing with privacy, security, legal components, regulatory issues, they carry real consequences. Rick, he talked about it. There's going to be some things coming up it sounds like. So, make sure you're prepared.
JJ Agha (41:29):
Then I'll just reiterate: do not have analysis paralysis. Make decisions. You're going to make wrong decisions, and your security counterpart is going to make wrong decisions. The best thing to do is keep moving forward, because sitting in your cure period not doing anything... I do not want to be in that situation.
Brandon Greenwood (41:48):
Well, thank you, everyone.
JJ Agha (41:49):
Thank you all.