

With new state-level laws like the Tennessee Information Protection Act and the Minnesota Consumer Data Privacy Act taking effect in July—and the next phase of the EU AI Act on the horizon in August—understanding the nuances of these regulations is essential.

During this conversation, we’ll break down the latest consumer privacy laws. You’ll gain:

  • A breakdown of upcoming privacy regulations and their implications
  • Common themes across the new laws and what's needed to ensure compliance
  • Retail-specific strategies for navigating data privacy challenges
  • Actionable advice to adapt and stay compliant in a shifting landscape

Join us for this engaging conversation designed to equip you with the knowledge needed to confidently navigate today’s fast-paced data privacy environment.

Speakers


Jasmine Sharma
Privacy Community Manager at DataGrail

Monique Altman
Senior Privacy Program Manager at Ping Identity

Dwight Turner
Compliance Analyst at SimSpace

Julie Cantor
Founder, Cantor Law, LLC
Transcript

1
00:00:03.100 --> 00:00:11.559
Jasmine Sharma: Hi, everyone, and welcome. Thanks so much for joining us today for our webinar on how to navigate new state-level consumer privacy laws.

2
00:00:11.590 --> 00:00:31.480
Jasmine Sharma: It's a busy time in the privacy world, and we know how much is landing on your plate. Just this year, eight new U.S. state privacy laws are taking effect, bringing us to a total of 20 state privacy laws in play. That means a lot of moving parts: new rights to honor, new obligations to document,

3
00:00:31.480 --> 00:01:00.419
Jasmine Sharma: as well as new questions to answer across your teams. But you're in the right place. We're here to break down what matters most for privacy and legal teams right now and to help you get ahead of the July deadlines with confidence. Fortunately, we have an amazing panel: our expert guests will share practical tips on how to prepare for all these regulatory changes, and we'll point you to some helpful resources so you can stay ahead of what's coming next.

4
00:01:00.560 --> 00:01:29.889
Jasmine Sharma: And some quick housekeeping before we jump in: a recording of today's webinar will be emailed to you after the session, so no worries if you miss something. You are muted by default, but feel free to use the Q&A feature at the bottom of your screen anytime. We'll answer as many questions as we can during the live panel discussion, and if we don't get to yours, we'll follow up with you directly.

5
00:01:29.930 --> 00:01:43.550
Jasmine Sharma: And finally, a quick reminder that this webinar is not legal advice. It's designed to help guide and inform you, but we always recommend reviewing with your legal counsel to figure out exactly what applies to your organization.

6
00:01:44.380 --> 00:02:00.230
Jasmine Sharma: All right, with that out of the way, let's get started. I'm Jasmine Sharma, your moderator for today's session and the Privacy Community Manager at DataGrail. I am a licensed California attorney and specialize in privacy compliance.

7
00:02:00.230 --> 00:02:17.179
Jasmine Sharma: My mission is to help raise awareness about privacy regulations and provide resources that enable professionals like you to stay ahead of the curve. I'm really excited to be here and guide you through today's session. So let's get right into our agenda.

8
00:02:18.230 --> 00:02:30.799
Jasmine Sharma: Here is a quick look at what we'll cover today. First, I'll walk through the two new state privacy laws taking effect this July, starting with the Tennessee Information Protection Act,

9
00:02:30.800 --> 00:02:48.520
Jasmine Sharma: followed by the Minnesota Consumer Data Privacy Act. We'll spend a bit more time on Minnesota, since it introduces some unique requirements, including new rights around profiling and specific rules for handling data subject requests involving sensitive data.

10
00:02:48.540 --> 00:03:01.540
Jasmine Sharma: After that, I'll give a brief overview of the EU AI Act, its upcoming phases, and what privacy teams should be watching for as the enforcement timeline unfolds,

11
00:03:01.790 --> 00:03:08.920
Jasmine Sharma: and then we'll open things up for a Q&A, where our expert guests will weigh in on your top questions.

12
00:03:09.090 --> 00:03:24.920
Jasmine Sharma: As I mentioned, you can submit your questions anytime using the Q&A function at the bottom of your screen. If time allows, we'll work some of those questions into the panel discussion; otherwise, we'll try our best to follow up with you directly.

13
00:03:25.200 --> 00:03:33.439
Jasmine Sharma: Finally, we'll wrap things up with a few essential resources to help you and your teams get ahead of all the changes coming this year.

14
00:03:33.640 --> 00:03:38.410
Jasmine Sharma: So with that, let's jump into our agenda.

15
00:03:39.780 --> 00:04:06.170
Jasmine Sharma: Today we're lucky to have an expert panel here to help guide us through all these changes. They'll be sharing their insights on the new privacy regulations and offering their perspectives on what it all means for privacy teams. Please join me in welcoming Monique, Dwight, and Julie. Monique, let's start with you. Could you introduce yourself and tell us a little bit about your role and background?

16
00:04:06.170 --> 00:04:15.149
Monique Altman: Sure, happy to. Hello, everyone. My name again is Monique Altman, and I'm a Senior Privacy Program Manager at Ping Identity.

17
00:04:15.330 --> 00:04:28.929
Monique Altman: My journey into privacy began with a JD and an LLM in data privacy law. But more interesting to the group, probably, is that professionally I've had the privilege of working as a privacy specialist

18
00:04:29.160 --> 00:04:44.559
Monique Altman: in diverse and dynamic environments. My experience spans from my time as a consultant with a Big Four firm, advising a wide range of clients, to more hands-on roles within the medtech, SaaS, and entertainment industries.

19
00:04:44.930 --> 00:04:56.690
Monique Altman: What truly draws me to data privacy is a deep-seated belief that we all have a fundamental right to our personal data and a say in how it's used.

20
00:04:56.860 --> 00:04:59.279
Monique Altman: This conviction drives my work.

21
00:04:59.380 --> 00:05:10.360
Monique Altman: I am committed to honoring consumers' right to privacy, and I believe that the most effective approach to data privacy is one that is ethical, pragmatic, and efficient.

22
00:05:10.560 --> 00:05:16.960
Monique Altman: It's about finding solutions that protect individuals while also enabling innovation and business growth.

23
00:05:17.260 --> 00:05:21.039
Monique Altman: I look forward to sharing my insights with you today. Thank you.

24
00:05:21.970 --> 00:05:27.389
Jasmine Sharma: Thank you, Monique. Next, Dwight, if you could please share a little bit about yourself as well.

25
00:05:28.890 --> 00:05:31.840
Dwight Turner: Hi, everybody. Thanks for having me. I

26
00:05:32.010 --> 00:05:56.510
Dwight Turner: am originally a nonprofit guy who got into tech, in particular cybersecurity. Currently, I'm a Compliance Analyst at SimSpace, where I work a lot on SOC 2 and CMMC compliance. If you're familiar with CMMC, we talk about controlled unclassified information, as they call it.

27
00:05:56.570 --> 00:06:04.490
Dwight Turner: A lot of what I do in my everyday role is really similar to what you might do in a privacy role: trying to track down where this

28
00:06:04.600 --> 00:06:23.999
Dwight Turner: data exists, trying to make sure we have the proper controls on it, looking at risk, vendors, and other types of compliance issues, and corralling everybody to make sure we're protecting it, making sure it's in our contracts,

29
00:06:24.462 --> 00:06:37.859
Dwight Turner: and that everybody knows how to talk to the auditors when that time comes around as well. So it's very closely related to privacy. And I'm passionate about talking about privacy and

30
00:06:38.070 --> 00:06:44.059
Dwight Turner: improving our privacy literacy in our families and communities, as well as

31
00:06:44.660 --> 00:06:49.990
Dwight Turner: what's coming up next: AI governance. I'm excited to talk to everybody about this stuff today.

32
00:06:50.990 --> 00:06:55.759
Jasmine Sharma: Perfect, thanks so much, Dwight. And now, Julie, we'd love to hear from you as well.

33
00:06:56.780 --> 00:07:20.329
Julie Cantor: Hi, everybody! My name is Julie Cantor. I am the founder of Cantor Law, LLC, a law firm that provides employment, data privacy, and commercial support to in-house teams. I first started as in-house counsel about 11 years ago, and my practice has generally been in-house at high-growth companies in a variety of roles.

34
00:07:20.340 --> 00:07:43.580
Julie Cantor: I first started working on privacy when I joined the team at Studs, the earring and ear-piercing company, in 2022 as their Associate General Counsel, and was tasked with creating a data privacy program that was comprehensive, meaningful to our customers, and understandable to our employees, who ultimately had to implement it. And so we ended up

35
00:07:43.580 --> 00:07:51.239
Julie Cantor: engaging DataGrail, and they were a great partner to us. It's wonderful to be here, and I can't wait to talk about

36
00:07:51.360 --> 00:07:53.030
Julie Cantor: the updates to the laws.

37
00:07:53.730 --> 00:08:06.019
Jasmine Sharma: Yes, thank you so much, Julie, and thanks again to all our panelists for those introductions. Without further ado, let's jump into these new privacy laws and see what they mean in practice.

38
00:08:06.480 --> 00:08:32.349
Jasmine Sharma: Let's start with Tennessee's Information Protection Act, which is set to go into effect on July 1. At its core, Tennessee aligns with many other state privacy laws we've seen, granting the full set of basic data subject rights. These are the rights individuals have over their personal data, and under Tennessee law that includes access to their data, correction of inaccuracies,

39
00:08:32.350 --> 00:08:37.600
Jasmine Sharma: deletion of personal data, data portability, as well as the right to opt out.

40
00:08:37.600 --> 00:09:02.389
Jasmine Sharma: But there are a few things that make Tennessee stand out. First, it sets one of the highest applicability thresholds in the country, applying only to businesses that process the data of 175,000 or more unique consumers annually. That's a very high number compared to states like California or Virginia, which use a 100,000-consumer threshold.

41
00:09:02.390 --> 00:09:25.680
Jasmine Sharma: Second, Tennessee includes two business-friendly provisions that are worth flagging. One, it sets an entity-level exemption for insurance companies. Unlike most state laws, which only exempt specific types of data, such as health or financial data, Tennessee fully exempts state-licensed insurance companies.

42
00:09:25.680 --> 00:09:43.659
Jasmine Sharma: What does that mean exactly? Well, if you're a state-licensed insurance company, you're completely exempt from the law, not just parts of it. That's a big deal, because it's the first time we've seen a full exemption at the company level written into one of these laws.

43
00:09:43.700 --> 00:10:12.430
Jasmine Sharma: Two, Tennessee offers a generous 60-day cure period with no sunset clause. That means businesses notified of a violation by the state attorney general have 60 days to fix the issue before actually facing enforcement. That's one of the longest cure periods in the country, second only to Iowa, which offers 90 days.

44
00:10:12.430 --> 00:10:16.149
Jasmine Sharma: And here's the kicker: there is no sunset clause,

45
00:10:16.150 --> 00:10:32.140
Jasmine Sharma: so compare that to laws like California's CCPA, where cure periods were temporary and eventually phased out. Companies get that 60-day buffer indefinitely, which makes this law especially business-friendly.

46
00:10:32.630 --> 00:10:47.609
Jasmine Sharma: Alright, now you see the first bullet we have here: the NIST defense. I'm sure many of you have already heard about this making headlines. Tennessee is the first state to include an affirmative defense tied to the NIST Privacy Framework.

47
00:10:47.650 --> 00:11:14.519
Jasmine Sharma: What does this mean in practice? If your company adopts a privacy program that aligns with the NIST Privacy Framework and you're later accused of violating the law, you can use your NIST-aligned program as a legal defense. This doesn't make you immune from enforcement, but it does mean you can demonstrate alignment with a recognized privacy standard, which can greatly reduce your legal risk.

48
00:11:14.630 --> 00:11:41.699
Jasmine Sharma: This in part raises a bigger question about the direction privacy programs are heading. As someone who has practiced in security, Dwight, I'd love to get your thoughts on this provision. With Tennessee offering this affirmative defense based on NIST alignment, do you think we'll start seeing more privacy programs formally benchmarked against NIST? What are your thoughts here?

49
00:11:43.880 --> 00:11:50.020
Dwight Turner: It's a great question. I think it might be a little bit too early to tell. We have seen that

50
00:11:50.140 --> 00:12:01.090
Dwight Turner: Tennessee has done this previously with the NIST CSF, the Cybersecurity Framework, with sort of mixed reviews. It's great for

51
00:12:01.450 --> 00:12:09.359
Dwight Turner: legislators, it's great for businesses. I think it would be included on that list of things that makes it a business-friendly law.

52
00:12:09.770 --> 00:12:15.430
Dwight Turner: But how much that benefits consumers, I think, might be in question,

53
00:12:16.770 --> 00:12:37.900
Dwight Turner: so we can maybe discuss that a little bit towards the end. But I do think it is great in terms of elevating the NIST Privacy Framework, which, if anybody's looked at it, is pretty straightforward and easy to understand compared to a lot of other NIST documents. You can pick up the controls and get an idea of what it's about fairly quickly.

54
00:12:38.040 --> 00:12:53.519
Dwight Turner: And it's recently been updated, so it includes some considerations for AI governance, for example. It's still a little bit early to tell, but it could be positive for businesses.

55
00:12:54.440 --> 00:13:08.330
Jasmine Sharma: Yeah, that's a great point, Dwight. Thanks so much for breaking that down. I agree, this is a big signal that benchmarking your privacy program against widely accepted standards like NIST

56
00:13:08.330 --> 00:13:32.899
Jasmine Sharma: might become more of a trend in the future. I know that for a lot of teams right now, especially those newer to NIST, it can feel a little overwhelming figuring out where to start. You mentioned a little bit in your answer about where they could start, but do you have any more advice for teams that haven't really explored NIST yet? Where would you recommend they begin?

57
00:13:34.700 --> 00:13:54.690
Dwight Turner: I think they are actually still taking comments on the latest version of it. So if you pull up the NIST website, you can have a look at the latest version. You can even go back and look at the 1.0 version and see what you think, especially if you have complaints or suggestions. I think a lot of us found that

58
00:13:54.710 --> 00:14:07.589
Dwight Turner: it maybe wasn't comprehensive enough, or didn't really consider, like I mentioned before, AI governance or some of these fast-moving challenges that we have at the moment.

59
00:14:07.630 --> 00:14:16.859
Dwight Turner: So being able to comment and give feedback on that, I think, is helpful. But even if you're just going in, I would encourage you to,

60
00:14:17.130 --> 00:14:36.770
Dwight Turner: like I said, scan the categories. They put it into categories that are easy to remember (identify, govern, control, those types of easy-to-point-to things), and give some consideration to what your practices are at the moment.

61
00:14:37.242 --> 00:14:43.230
Dwight Turner: Especially if you don't currently have a framework, I think it can be a good place to start.

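To make that first pass concrete, here is a minimal sketch of a self-assessment against the five functions of the NIST Privacy Framework 1.0 (Identify-P, Govern-P, Control-P, Communicate-P, Protect-P, the categories Dwight mentions). The status values and notes are illustrative assumptions about a hypothetical program, not NIST guidance:

```python
# Hypothetical first-pass gap scan against the NIST Privacy Framework 1.0
# functions. Status values and notes describe your own current practices.

NIST_PF_FUNCTIONS = ["Identify-P", "Govern-P", "Control-P", "Communicate-P", "Protect-P"]

# Illustrative self-assessment: map each function to current practice notes.
assessment = {
    "Identify-P": {"status": "partial", "note": "data inventory exists but is stale"},
    "Govern-P": {"status": "missing", "note": "no documented privacy policies"},
    "Control-P": {"status": "partial", "note": "DSR process is manual"},
    "Communicate-P": {"status": "ok", "note": "privacy notice reviewed this year"},
    "Protect-P": {"status": "ok", "note": "covered by existing security program"},
}

def gaps(assessment: dict) -> list[str]:
    """Return the functions that need attention, worst first."""
    order = {"missing": 0, "partial": 1, "ok": 2}
    flagged = [f for f in NIST_PF_FUNCTIONS if assessment[f]["status"] != "ok"]
    return sorted(flagged, key=lambda f: order[assessment[f]["status"]])

print(gaps(assessment))  # ['Govern-P', 'Identify-P', 'Control-P']
```
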
62
00:14:44.500 --> 00:15:14.370
Jasmine Sharma: Yeah, definitely, that is great advice, especially when it comes to thinking about those early steps or quick wins that can make it all more manageable. Let's move over to Minnesota now; there's a lot to unpack here. This law takes effect on July 31 of this year, and compared to other state laws, it raises the bar in a few important areas, especially around profiling transparency as well as internal governance requirements.

63
00:15:14.370 --> 00:15:31.760
Jasmine Sharma: I'd go so far as to say this is one of the stricter privacy laws taking effect this year. There are a lot of moving pieces here, but don't worry, we'll get through it together. Let's start with what makes Minnesota similar to the other state laws.

64
00:15:31.820 --> 00:15:46.899
Jasmine Sharma: First, Minnesota grants the standard rights we have here: access to data, correction of inaccuracies, deletion of personal information, data portability, as well as opting out. That's familiar territory for many of us.

65
00:15:46.900 --> 00:16:04.730
Jasmine Sharma: But here's where Minnesota sets itself apart. As you can see, we have profiling rights listed here. Minnesota is the first state to give individuals the right to challenge decisions made through automated profiling that have legal or similar effects.

66
00:16:04.770 --> 00:16:15.830
Jasmine Sharma: Think about applying for loans, housing, or access to those types of services. I'll break this down more on the next slide, but it goes way beyond just opting out.

67
00:16:16.290 --> 00:16:21.470
Jasmine Sharma: And then, number two, there are limitations on DSR disclosures.

68
00:16:22.040 --> 00:16:44.689
Jasmine Sharma: When someone requests their data, Minnesota limits what can actually be shared back, especially sensitive information like Social Security numbers, financial account numbers, passwords, and so on. I'll walk through that in more detail shortly, but first I want to call out a few governance requirements that make this law stand out.

69
00:16:44.690 --> 00:16:56.020
Jasmine Sharma: Minnesota moves beyond surface-level privacy and gets into the operational core. It requires businesses to maintain a live data inventory.

70
00:16:56.020 --> 00:17:24.520
Jasmine Sharma: This is a first among U.S. states. Some of you might be thinking that's not shocking, because the GDPR already expects this. But here's the thing: in most U.S. state laws so far, a data inventory has been more of a best practice, whereas Minnesota actually makes it mandatory. This signals a big shift from privacy being a legal checkbox to becoming a core operational responsibility.

71
00:17:24.589 --> 00:17:53.819
Jasmine Sharma: In addition, the law requires businesses to document their internal privacy policies and compliance procedures. I also want to call out another standout we have listed here: Minnesota is one of the only states to require a designated chief privacy officer. So if your organization's privacy responsibilities are currently shared across teams, now is the time to assign clear ownership.

72
00:17:53.890 --> 00:18:23.669
Jasmine Sharma: Finally, a quick note on third-party transparency. Minnesota requires businesses to provide, on request, either a list of the specific third parties personal data has been shared with or, at a minimum, a list of the categories of those third parties. A lot of organizations may not have this information readily on hand, so it's worth checking your vendor agreements and data-sharing contracts to get ahead of that provision.

73
00:18:23.700 --> 00:18:34.619
Jasmine Sharma: Also, if you're thinking you might have heard of this third-party right before, you are right: the Minnesota provision is just like the third-party right in the Oregon Consumer Privacy Act.

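As a concrete illustration of that disclosure provision, here is a minimal sketch of assembling either list from vendor records. The record fields and vendor names are hypothetical, not a DataGrail API:

```python
# Hypothetical sketch: answer a Minnesota third-party disclosure request from
# vendor records, returning either specific recipients or, at a minimum,
# their categories. Record fields and vendor names are illustrative.

vendors = [
    {"name": "AdTech Co", "category": "advertising partner", "receives_pd": True},
    {"name": "Cloud Host Inc", "category": "infrastructure provider", "receives_pd": True},
    {"name": "Email Relay LLC", "category": "communications provider", "receives_pd": True},
    {"name": "Office Supply Shop", "category": "facilities vendor", "receives_pd": False},
]

def third_party_disclosure(vendors: list, specific: bool = True) -> list:
    """List the third parties personal data was shared with, or their categories."""
    recipients = [v for v in vendors if v["receives_pd"]]
    if specific:
        return sorted(v["name"] for v in recipients)
    return sorted({v["category"] for v in recipients})

print(third_party_disclosure(vendors))                  # specific third parties
print(third_party_disclosure(vendors, specific=False))  # categories only
```
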
74
00:18:35.410 --> 00:19:03.639
Jasmine Sharma: All right, I told you we'd come back to those profiling rights. I just want to underscore that Minnesota's profiling rights are the most detailed we've seen in any U.S. state law so far. Under this law, if your organization uses automated decision-making, for example to decide who qualifies for a loan, housing, or medical treatment, Minnesota gives people the right to challenge those decisions.

75
00:19:03.640 --> 00:19:16.900
Jasmine Sharma: We have it broken down here on the slide, but let me walk you through what that includes. First, the consumer can request access to the personal data that was used,

76
00:19:16.900 --> 00:19:41.130
Jasmine Sharma: and then the consumer can correct anything that's wrong there, as well as receive an explanation of how the decision was actually made, including what factors played the biggest role, and find out what they could have changed to get a different result. Essentially, the consumer can appeal the decision and ask for a human to take another look at it.

77
00:19:41.180 --> 00:20:00.820
Jasmine Sharma: This is, I would say, the first U.S. state law to offer this level of control over profiling. So if your systems rely on AI or algorithms to make those important calls, Minnesota is saying you can't just automate; you need to explain.

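One hedged sketch of what supporting those rights operationally could look like: recording each automated decision with enough context to explain it and route an appeal to human review. All field names here are hypothetical, not drawn from the statute or any product:

```python
# Illustrative decision record for automated profiling, capturing the data
# used, the outcome, and the key factors, so the explanation, correction,
# and human-review appeal rights described above can be honored later.

import datetime

def record_decision(consumer_id: str, inputs: dict, outcome: str, key_factors: list) -> dict:
    """Store enough context to explain a decision and support a later appeal."""
    return {
        "consumer_id": consumer_id,
        "decided_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "inputs": inputs,            # the personal data the system actually used
        "outcome": outcome,          # e.g. "approved" or "denied"
        "key_factors": key_factors,  # what played the biggest role
        "human_review": None,        # filled in if the consumer appeals
    }

decision = record_decision(
    consumer_id="c-1042",
    inputs={"income": 52000, "credit_history_months": 18},
    outcome="denied",
    key_factors=["credit history under 24 months"],
)
print(decision["key_factors"])
```
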
78
00:20:01.190 --> 00:20:12.329
Jasmine Sharma: Going forward, I want to shift to something practical yet important: a little more on the redaction requirements when fulfilling those data subject requests.

79
00:20:12.400 --> 00:20:38.509
Jasmine Sharma: Under Minnesota's privacy law, and in most states, when someone submits a DSR, let's say a request to access their data, you are expected to verify their identity and provide that information securely. But Minnesota adds a new wrinkle: when responding to DSRs, you must not disclose certain sensitive data, even back to the person it belongs to.

80
00:20:38.550 --> 00:21:06.970
Jasmine Sharma: Instead, you need to confirm whether that data was collected without actually sharing the value itself. Let me give you some examples of what should be withheld or redacted; we have them listed here on the slide, too: Social Security numbers, government-issued IDs like driver's license numbers, financial account numbers, passwords, security questions, as well as biometric data.

81
00:21:07.810 --> 00:21:33.329
Jasmine Sharma: Here's what that would look like in practice. Say a consumer requests all their data, and your system includes their Social Security number. Instead of replying that you have their Social Security number and explicitly showing the number to them, you respond with something like, "We collected your SSN but cannot disclose it for security reasons." That kind of response strikes the balance

82
00:21:33.330 --> 00:21:41.480
Jasmine Sharma: between transparency and security. Going forward, this means your DSR fulfillment process

83
00:21:41.560 --> 00:21:57.160
Jasmine Sharma: will most likely need to flag and redact these fields automatically, and you'll potentially need to train privacy teams and vendors handling DSRs on these types of rules. Ideally, you could also template the wording used in these confirmations, as I explained.

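Here is a minimal sketch of that flag-and-template step. The restricted field list and message wording are illustrative assumptions based on the categories named above, not legal text or DataGrail functionality:

```python
# Hypothetical sketch: withhold Minnesota-restricted values in a DSR access
# response, confirming collection with templated wording instead of echoing
# the value back. Field names and message wording are illustrative.

RESTRICTED_FIELDS = {
    "ssn": "Social Security number",
    "drivers_license": "government-issued ID number",
    "financial_account": "financial account number",
    "password": "password",
    "security_question": "security question and answer",
    "biometric_data": "biometric data",
}

CONFIRMATION = "We collected your {label} but cannot disclose it for security reasons."

def redact_dsr_response(record: dict) -> dict:
    """Return a copy of the access response with restricted values withheld."""
    redacted = {}
    for field, value in record.items():
        if field in RESTRICTED_FIELDS and value is not None:
            # Confirm the data was collected without sharing the value itself.
            redacted[field] = CONFIRMATION.format(label=RESTRICTED_FIELDS[field])
        else:
            redacted[field] = value
    return redacted

# The SSN is confirmed but never echoed back to the requester.
print(redact_dsr_response({"email": "pat@example.com", "ssn": "123-45-6789"}))
```
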
84
00:21:58.260 --> 00:22:23.829
Jasmine Sharma: Now that we've covered those key operational pieces, from mandated data inventories to new redaction rules and profiling rights, I want to bring in our expert guests to share some practical guidance. Let me start with a question I think many people are thinking about: is the average privacy team ready for the unique requirements of Minnesota's law today,

85
00:22:23.830 --> 00:22:35.430
Jasmine Sharma: and where do you think teams are most likely to struggle? Is it around internal governance, technical implementation, or something else? I'd love to hear what our experts have to say on this.

86
00:22:37.840 --> 00:23:07.240
Monique Altman: I can kick it off. Generally, I would say no, the average privacy team is likely not fully ready for Minnesota's law today. While many organizations have gained experience with the CCPA or GDPR, this law goes beyond what we're accustomed to in specific areas. I think the most prepared teams will be those that have already

87
00:23:07.430 --> 00:23:26.870
Monique Altman: embraced a comprehensive, risk-based privacy program as opposed to just a compliance checklist, which was touched on earlier. As for where teams may struggle, I'll touch on this quickly so the other panelists have time to chime in. Initially, I would say the profiling oversight and transparency piece.

88
00:23:28.530 --> 00:23:48.430
Monique Altman: In my view, this demands a deep understanding of AI and automated decision-making systems, if indeed you're using algorithms and AI to make the decisions you touched on, Jasmine. And you also need to have clear consumer-facing explanations

89
00:23:48.430 --> 00:24:05.569
Monique Altman: for those decisions. The other one is sensitive data access limits. Minnesota's law is the first state law to explicitly exempt certain sensitive data elements from disclosure: SSNs, driver's licenses, biometrics.

90
00:24:05.690 --> 00:24:07.020
Monique Altman: And so

91
00:24:07.480 --> 00:24:23.349
Monique Altman: businesses are going to have to be prepared to describe the type of sensitive data collected, not the specifics (redaction, as you mentioned), balancing that transparency with the high sensitivity of these sorts of bits of information.

92
00:24:23.620 --> 00:24:45.969
Monique Altman: And then the next one would be third-party data-sharing disclosure. Like Oregon's law, Minnesota grants consumers the right to request a list of third-party data recipients. But this is going to be, in my view, a significant logistical challenge for businesses with really complex data-sharing ecosystems, for example in advertising,

93
00:24:46.394 --> 00:24:55.729
Monique Altman: or even at a lot of technology companies where you're sharing with third and nth parties to help you provide your services.

94
00:24:56.990 --> 00:25:01.799
Monique Altman: So those are some of my thoughts on what the first bumps might be,

95
00:25:01.980 --> 00:25:13.119
Monique Altman: and if you keep those in mind, along with any other advice provided today, you'll have a list of what you need to prepare for in the next month or so.

96
00:25:17.770 --> 00:25:25.510
Jasmine Sharma: Yeah, that sounds really good. Dwight, Julie, if you'd like to add anything; otherwise I can go on to the next question. I'll just take a quick pause, just in case.

97
00:25:28.630 --> 00:25:46.159
Jasmine Sharma: Okay, perfect then. Now that we've talked about data inventories, I want to take a moment to think a little bit about teams that are building their first comprehensive data map. How would you recommend they approach this exercise efficiently, without getting lost in the weeds?

98
00:25:51.260 --> 00:26:13.744
Julie Cantor: I'm happy to jump in here. I think the most important thing, if you've never done a data map before, is to understand who the key players are on your team, who is likely to be adopting technology, and who is making the decisions as to what gets entered into the tech stack, and to make sure that you are very, very close with that person. Because

99
00:26:14.080 --> 00:26:28.409
Julie Cantor: a live data inventory can be really, really challenging if you're not doing any type of automated scan. The reason for that is, you may think that you know exactly what your company is doing in terms of

100
00:26:28.670 --> 00:26:33.130
Julie Cantor: apps that incorporate personal information. But

101
00:26:33.350 --> 00:26:43.527
Julie Cantor: every single employee in every single workplace where I've ever worked is looking to optimize their workflows to the greatest extent possible.

102
00:26:44.270 --> 00:27:09.099
Julie Cantor: and therefore their attention will probably be drawn to the many, many different tools out there that you can just click to accept, maybe even for free or for a free trial. All of a sudden, a person who you may not have considered a tech adopter in your company is now taking on a new platform

103
00:27:09.790 --> 00:27:18.610
Julie Cantor: that needs to be in your data map. So staying on top of what your teams are doing, who's using what to do what, is the very first

104
00:27:18.740 --> 00:27:40.010
Julie Cantor: building block, and it's something you need to continuously revisit. Every time you're having your one-on-one with the leader of the product team: have you adopted any new tools? Have you turned off any tools? That's super important. Using a tool like DataGrail can help you stay on top of the changes happening in your tech stack a little bit better, but manually it's really tricky.

105
00:27:41.970 --> 00:28:09.699
Monique Altman: And I would like to quickly add: if this is the first time you are doing a data map, just remember that efficiency is key, plus all those things that Julie just talked about. I would say, define a clear scope first. Don't try to map everything at once; clearly state why you're mapping it and get an understanding of that. Engage your key stakeholders, and prioritize high-risk, high-volume data and sensitive data.

106
00:28:09.700 --> 00:28:36.840
Monique Altman: And then, my last little bit: answer these core questions when you're going through the data mapping process. What personal data is collected? Why is it processed? How is it collected and processed? What systems are used? Where is it stored? Who has access internally? Who has access as a third or nth party? How long is it retained? Where does it flow, your data transfers? Those are some things to keep in mind. And if you can keep the scope

107
00:28:37.328 --> 00:28:51.480
Monique Altman: something you can define early on, you'll be in good shape, because it's a living document and it's always going to expand and change. Keeping that in mind helps you get over that first initial step of creating it.

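One minimal sketch of an inventory record that answers the core questions Monique lists above. The schema is an illustrative assumption, not a DataGrail or regulatory format:

```python
# Illustrative data-inventory entry covering the core questions listed above:
# what is collected, why, how, where, who has access, retention, and transfers.

from dataclasses import dataclass, field

@dataclass
class InventoryEntry:
    data_category: str           # what personal data is collected
    purpose: str                 # why it is processed
    collection_method: str       # how it is collected and processed
    systems: list                # what systems are used / where it is stored
    internal_access: list        # who has access internally
    third_parties: list          # third- and nth-party recipients
    retention: str               # how long it is retained
    transfers: list = field(default_factory=list)  # cross-border data flows

entry = InventoryEntry(
    data_category="email address",
    purpose="order confirmations and support",
    collection_method="checkout form",
    systems=["ecommerce platform", "CRM"],
    internal_access=["support", "marketing"],
    third_parties=["email service provider"],
    retention="24 months after last purchase",
    transfers=["US -> EU (standard contractual clauses)"],
)
print(entry.data_category, "->", entry.third_parties)
```
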
108
00:28:54.160 --> 00:28:57.185
Jasmine Sharma: Love that. Thank you. Oh, go ahead, Dwight.

109
00:28:57.690 --> 00:29:03.460
Dwight Turner: Sorry, just to tack on a little bit there, because I also saw the question in the chat. I would just add,

110
00:29:03.760 --> 00:29:09.990
Dwight Turner: if you're the only person on your privacy team, or the only one responsible

111
00:29:10.080 --> 00:29:18.860
Dwight Turner: for the data map, do whatever you can to pull some other folks in. If you happen to have a data committee, or

112
00:29:19.245 --> 00:29:47.389
Dwight Turner: if there are data engineers, data scientists, anyone responsible for any sort of data governance or particular types of data, including maybe responsibility for certain vendors, get anybody you can into a meeting to go through it, because you don't want to be going through a list of a thousand vendors or all these data types by yourself, trying to figure out

113
00:29:47.510 --> 00:30:00.129
Dwight Turner: what goes where, what we are doing with it, and what the legal basis is for all of these things, by yourself. That would be my only warning, especially if you're trying to get compliant quickly.

114
00:30:00.587 --> 00:30:18.160
Dwight Turner: Even if you're the point person for it, do what you can to reach out to your team, get executive support when possible, and reach out to other resources you might have at your company to decentralize when possible.

115
00:30:19.020 --> 00:30:38.219
Dwight Turner: Oh, and technical solutions: if you already have a DLP system, that is, data loss prevention, which is, for example, scanning your data at rest and can pick up on data labels and things like that, that can also help you figure out what kind of data you're managing.

116
00:30:40.810 --> 00:31:10.349
Jasmine Sharma: Awesome, thank you for sharing. Honestly, great advice all around, and I love the idea of that checklist, too. Overall, I think a lot of teams are still in the mindset of just getting through core compliance, and Minnesota is definitely asking for something deeper and more structured going forward. Now, my last question regarding Minnesota relates to something we're keeping a close eye on, which is how automated decision-making appeals will play out in practice.

117
00:31:10.350 --> 00:31:28.109
Jasmine Sharma: Julie, I'd love your take on this. How should companies think about profiling and the automated decision-making piece under this law, especially given the various carve-outs we see, such as employment data and HIPAA-covered health data? And what practical steps should they be thinking about now?

118
00:31:29.350 --> 00:31:52.610
Julie Cantor: Yeah, that's a great question, because the Minnesota law does regulate profiling and creates a unique set of new steps that no other state has, but within a narrower scope than people might expect. And I think there are some contradictions in the law that we are going to see play out as enforcement begins, so it's something we will all need to continuously monitor.

119
00:31:52.610 --> 00:32:15.690
Julie Cantor: If folks on the call have in-house counsel focused on this, or outside counsel, or a privacy team, it's something we'll have to watch play out. The contradiction I'm calling out is that the profiling section covers decisions that produce legal or similarly significant effects concerning the consumer, meaning decisions made by the controller that result in the provision

120
00:32:15.690 --> 00:32:26.220
Julie Cantor: or denial by the controller of financial or lending services, housing, insurance, education enrollment, criminal justice, employment opportunities, healthcare services, or access to essential goods and services.

121
00:32:26.270 --> 00:32:51.759
Julie Cantor: That's very broad and encompassing. However, if you look carefully at the exceptions under the law, employment and job application data are carved out, as is data covered by the Fair Credit Reporting Act, which is going to intersect with the financial information pieces, as well as the GLBA. When we think about medical decision-making, HIPAA-protected data is also carved out. So I think

122
00:32:51.980 --> 00:33:02.733
Julie Cantor: you're going to have to work really carefully to figure out whether the decision-making your tools are engaging in is actually considered automated decision-making under this law,

123
00:33:03.300 --> 00:33:04.180
Julie Cantor: but

124
00:33:04.300 --> 00:33:27.749
Julie Cantor: if it is possible that that is happening, you do need to be very careful about it. So think through: is your company making eligibility decisions? A person who is potentially eligible for certain perks or certain preferential pricing for a product? Those could potentially be considered processes that would be implicated under this section.

125
00:33:27.780 --> 00:33:42.440
Julie Cantor: And so, as Monique put it well, you're going to want to make sure that you inventory where profiling occurs, make sure that you're training your internal teams, and obviously build your opt-out and contestation mechanisms, which I think are going to be built

126
00:33:42.970 --> 00:34:01.679
Julie Cantor: from the ground up in most cases. And then finally, make sure you're thinking through edge cases. Are you collecting data through smart devices? Are you using a loyalty app connection, that type of thing? Even innocuous profiling might have meaningful effects that could trigger rights here.

127
00:34:03.910 --> 00:34:26.909
Jasmine Sharma: Yes, thanks so much for making that connection. Overall, it's definitely clear that Minnesota's law is expanding rights, especially in offering that additional right with respect to profiling. So thank you so much for sharing on that. I do want to shift gears now: we've covered the U.S. state privacy laws, so let's go into the EU AI Act.

128
00:34:27.179 --> 00:34:37.990
Jasmine Sharma: There's been a lot of buzz around this regulation lately. Just to recap, the purpose of the EU AI Act is to improve how the internal market functions,

129
00:34:37.989 --> 00:34:59.830
Jasmine Sharma: to promote a human-centric and trustworthy approach to AI, and to ensure a high level of protection against harmful AI effects while supporting innovation. The AI Act officially entered into force last summer, back in August of 2024, but its provisions are rolling out gradually over a three-year timeline,

130
00:34:59.830 --> 00:35:23.220
Jasmine Sharma: and, as you can see here on the slide, we've mapped out the key dates to watch. Some provisions already took effect this February, notably the prohibition on certain AI practices, like using AI to detect or predict people's emotions in the workplace, and new transparency requirements telling users when they're actually interacting with AI systems.

131
00:35:23.260 --> 00:35:37.500
Jasmine Sharma: Next up, this August, EU member states are required to designate independent notified bodies to assess the conformity of high-risk AI systems before they can be placed on the market.

132
00:35:37.500 --> 00:35:53.820
Jasmine Sharma: Looking ahead, there are further obligations scheduled for August of next year, 2026, and onward into 2027, related to transparency and compliance for different categories of high-risk AI systems.

133
00:35:53.820 --> 00:36:18.109
Jasmine Sharma: However, I'm not sure if you've seen it in the news, but there is some uncertainty right now about enforcement timing. Due to stakeholder concerns, some requirements, like the code of practice for general-purpose AI models, have been delayed beyond their original target dates. So businesses should definitely stay flexible here and keep an eye on updates moving forward.

134
00:36:18.210 --> 00:36:37.320
Jasmine Sharma: I do want to open this topic up to our panel and pose a question: with so many companies looking to AI for answers on how their brand can differentiate and scale, how can privacy teams balance that excitement while being mindful of the evolving regulation in this area?

135
00:36:41.740 --> 00:36:52.430
Julie Cantor: Yeah, I mean, I think that is the core question that not only companies but the national frameworks are looking to address. And

136
00:36:52.890 --> 00:36:56.729
Julie Cantor: it's a very exciting, evolving area,

137
00:36:56.910 --> 00:36:59.540
Julie Cantor: AI, that is. And

138
00:36:59.660 --> 00:37:11.379
Julie Cantor: there's this fundamental question of, do you allow maximum flexibility without regulation, or are you treading a little bit more carefully? And I think this

139
00:37:12.350 --> 00:37:38.169
Julie Cantor: background and framework is important to understand when you're thinking about how to meaningfully advise your company on compliance. One callout on the pause is that the timeline is a little unclear, and part of the reason for that, as I understand it, is that the EU is seeing action in the United States in terms of AI regulation and a potential pause there,

140
00:37:38.598 --> 00:37:42.881
Julie Cantor: and not wanting to fall behind. It's almost like

141
00:37:43.680 --> 00:37:51.309
Julie Cantor: a larger example of what is happening on the micro level in our companies. And by that I mean

142
00:37:51.620 --> 00:37:57.539
Julie Cantor: the EU itself is asking, do we want to fall behind by regulating

143
00:37:57.660 --> 00:38:12.410
Julie Cantor: heavily compared to the U.S., or do we want to remain a bit more competitive? That, I think, is part of the issue. In the U.S., there's an open question: there is the House version of the budget bill that

144
00:38:12.640 --> 00:38:34.179
Julie Cantor: would potentially result in a 10-year moratorium on state-level AI regulation. The Senate version modified it a bit; it might get altered further. And my understanding is that when it comes back to the House, there is going to be more pressure from the House Freedom Caucus, because the way it is drafted potentially

145
00:38:34.766 --> 00:38:39.053
Julie Cantor: runs afoul of traditional federalism values. So

146
00:38:39.920 --> 00:38:44.989
Julie Cantor: that is all kind of the regulatory background. The biggest

147
00:38:45.250 --> 00:38:58.930
Julie Cantor: advice that I have on the EU AI Act, for companies like the ones I have advised, is: don't assume that because it has EU in the name, it doesn't apply to you. I think many companies just assume,

148
00:38:59.350 --> 00:39:01.879
Julie Cantor: and you get just a general, high-level

149
00:39:02.070 --> 00:39:26.319
Julie Cantor: overview of it, and you think, great, my company is not focused on the EU, we only do U.S. stuff, not a problem, and then you just close your eyes to it. But this law explicitly has extraterritorial reach. If your AI system is used in the EU, even if you're not intending to do business in the EU, you may be on the hook. It applies to any organization established or located in the EU

150
00:39:26.320 --> 00:39:39.590
Julie Cantor: that uses AI systems, which is defined broadly. To give specific examples: let's say you're a U.S. online clothing retailer using AI for product recommendations. If EU consumers can access your site

151
00:39:39.590 --> 00:39:52.559
Julie Cantor: and see those recommendations, that could possibly be considered a regulated use. Another example is a U.S. startup providing chatbots that might be used in the EU; same thing, it could be regulated as a provider. So you've

152
00:39:52.790 --> 00:39:57.349
Julie Cantor: got to be extra careful there, and when in doubt, make sure to get

153
00:39:57.530 --> 00:40:03.909
Julie Cantor: competent advice as to the things that you are doing versus just making assumptions and then ignoring the law.

154
00:40:07.630 --> 00:40:26.140
Dwight Turner: Yeah, I would piggyback on that. You could look at it and say, oh, it's for the EU, it's not for us, but you could also look at it and say, I'm not a high-risk use case, and also put it off. So I would just encourage you to be proactive. Or,

155
00:40:26.300 --> 00:40:28.283
Dwight Turner: what do we say,

156
00:40:29.280 --> 00:40:35.789
Dwight Turner: the barbecue smells nice until you realize you're the one cooking, right? Don't wait until the stuff

157
00:40:36.010 --> 00:40:38.069
Dwight Turner: is really

158
00:40:40.110 --> 00:40:48.710
Dwight Turner: at your front door. Be proactive about some of those things. Understand, even in what would be considered a low-risk

159
00:40:49.401 --> 00:40:58.799
Dwight Turner: AI use, what you need to be doing: what type of transparency you need to have, what type of pre-deployment

160
00:40:59.600 --> 00:41:23.189
Dwight Turner: notices you need to provide, or pre-processing notices you need to provide, and what type of assessments you need to be doing. Are you doing an AI impact assessment? If you're saying you're a low-risk system, are you doing an assessment that can prove to an auditor or regulator that it is a low-risk system? So

161
00:41:23.595 --> 00:41:30.380
Dwight Turner: those are things to consider. Even if you think you're not in scope, it is worth having the discussion with your

162
00:41:30.570 --> 00:41:48.819
Dwight Turner: privacy team, compliance team, legal team, or AI committees about what it would look like if our laws in the U.S. are influenced by the EU AI Act, or if your company itself may have some compliance obligations under the law.

163
00:41:52.030 --> 00:42:16.810
Jasmine Sharma: That's so right. Thank you so much for sharing, Julie and Dwight. We're in the age of AI, but we need to remember that privacy teams have to navigate a fine line between innovation and protecting people's rights. It's all about building trust from the ground up. And I do appreciate, Dwight, you mentioning that early involvement is key; privacy teams do need to be part of the AI project discussion from the start.

164
00:42:16.810 --> 00:42:41.359
Jasmine Sharma: All right, now that we've covered the big picture, I want to take a moment to look ahead at what's coming for the remainder of the year. In September, the EU Data Act enters into force. Then in October, we'll see Maryland's new privacy law go into effect. October will also bring amendments to Montana's privacy act, particularly around genetic information.

165
00:42:41.360 --> 00:42:50.499
Jasmine Sharma: Montana will also adopt the redaction of certain sensitive data in DSARs, much like the Minnesota privacy law we just spoke about.

166
00:42:50.560 --> 00:43:18.450
Jasmine Sharma: Later this year, in December, the mandatory right-to-cure period expires in both Delaware and New Hampshire. This means regulators in those states will be able to take enforcement action without offering companies the chance to fix the issue first. And finally, state privacy is not slowing down for 2026: we're closing the year with three more state laws going into effect, in Indiana, Kentucky, and Rhode Island.

167
00:43:18.450 --> 00:43:23.610
Jasmine Sharma: So it's really safe to say that the privacy train isn't stopping anytime soon.

168
00:43:23.760 --> 00:43:39.700
Jasmine Sharma: All right, now that we have the facts down, I want to open up our panel discussion. I would love to dedicate this next portion of the webinar to our discussion and bring in all our featured speakers again to share their insights.

169
00:43:40.730 --> 00:44:02.739
Jasmine Sharma: But before we continue, you'll see a quick poll that's popped up on your screen. This is completely optional and just for those who may be interested in receiving a follow-up from our team; if that's not you, feel free to simply disregard it. And thank you in advance to those who do take a moment to respond to the poll. Now let's dive into our first question for our panelists.

170
00:44:02.740 --> 00:44:14.510
Jasmine Sharma: What do you think will be the biggest privacy compliance hurdle by the end of 2025? And do you think we're underestimating it right now? We'd love to hear what you all have to share here.

171
00:44:15.683 --> 00:44:23.810
Monique Altman: I'll kick it off. I think the biggest hurdle by the end of the year will likely be navigating

172
00:44:23.830 --> 00:44:48.070
Monique Altman: the sheer rapidity of all of these evolving state privacy laws and emerging AI-related privacy regulations. I am concerned that many businesses are currently underestimating the complexity of this landscape (I used to call it a patchwork; now I call it fragmented) and the deep technical requirements of AI governance.

173
00:44:48.120 --> 00:44:59.089
Monique Altman: Equally significant, and perhaps even more underestimated, is managing privacy risk across complex and extended supply chains.

174
00:44:59.210 --> 00:45:04.919
Monique Altman: The sheer number of third parties and nth parties, as I call them, handling personal data,

175
00:45:04.990 --> 00:45:22.860
Monique Altman: coupled with a variety of security standards and compliance expectations across jurisdictions, creates significant blind spots, I believe, and potentially amplifies liability that many organizations are not adequately prepared for. And this goes right

176
00:45:22.900 --> 00:45:40.840
Monique Altman: back to what Dwight and Julie were speaking of: do not let this come to your door first. Start thinking about these things now, even if you think it's really not applicable to you, because, as you all know, enforcement is the big word this year, too. So

177
00:45:41.000 --> 00:45:45.809
Monique Altman: just get ready. That's about all I can say: just get ready and be prepared.

178
00:45:53.190 --> 00:45:57.580
Jasmine Sharma: Alright, Dwight, did you want to share a little bit of your perspective here, too?

179
00:45:58.399 --> 00:46:16.740
Dwight Turner: Yeah, I was just going to reemphasize some things that we said earlier about being proactive. I think there is a lot of uncertainty: you have what's happening in the federal space versus what's happening with the state laws, you have

180
00:46:17.420 --> 00:46:23.550
Dwight Turner: the EU AI Act looking back at how they can do some things differently, and things like that. So you could

181
00:46:23.760 --> 00:46:36.339
Dwight Turner: get really comfortable. So I would just really encourage everyone to be proactive: at the very least, have some of those conversations now, look at some of those frameworks now, look at what it would take to implement

182
00:46:36.510 --> 00:46:44.070
Dwight Turner: some of those things now. And then, I would say, the paperwork is uphill. So

183
00:46:45.184 --> 00:46:53.185
Dwight Turner: if you're already having trouble responding to access requests, if you're already having trouble

184
00:46:53.880 --> 00:46:55.979
Dwight Turner: managing

185
00:46:56.230 --> 00:47:22.039
Dwight Turner: the vendor reviews and access reviews with your security teams, you will have trouble with some of the AI governance requirements. We're talking about conformity assessments, fundamental rights impact assessments, a couple of different flavors of privacy impact assessments. So getting your risk management

186
00:47:22.200 --> 00:47:43.120
Dwight Turner: healthy and active, and informed by a lot of different risk inputs, is something you can do right now without any fancy tools or anything, just a lot of communication and collaboration. I would really strongly encourage everyone to make those moves.

187
00:47:45.400 --> 00:48:08.610
Jasmine Sharma: Yeah, awesome. You both made such great points. I think the big takeaway we've learned here is: be proactive, not reactive, in privacy. Now moving to our next question, one that I personally feel is so relevant for this year: have regulators' expectations or enforcement behaviors influenced how your team presents privacy risks to leadership?

188
00:48:12.220 --> 00:48:19.360
Julie Cantor: I'm happy to weigh in on this. And I should say that I'm not currently working on an in-house privacy team; I'm in

189
00:48:19.380 --> 00:48:47.740
Julie Cantor: outside counsel, as it were. But I will say that regulator priorities, enforcement priorities, and enforcement budgets are very, very important when you think about what to prioritize in terms of compliance. If you know that California, for example, is putting a lot of money toward enforcement, then you want to make sure that California is at the top of your list in terms of ensuring compliance, and that

190
00:48:48.020 --> 00:49:03.750
Julie Cantor: is always going to be the case. You can have laws that are very, very aggressive, but if there's no money and no personnel behind enforcing the law, then it's going to be much less likely that your company will be targeted. That said,

191
00:49:03.810 --> 00:49:20.270
Julie Cantor: if you are a privacy-minded person and you believe in the importance of privacy compliance, which many of us on this call do, then you will still try to ensure that your company is complying with all of the laws to the greatest extent possible.

192
00:49:20.380 --> 00:49:22.957
Julie Cantor: But you need to be realistic as to

193
00:49:23.400 --> 00:49:41.320
Julie Cantor: what's actually going to happen and which boards are most likely to catch you doing something potentially violative of the laws. And then I think something to call out is thought leadership that I've seen the CEO of DataGrail provide, which is that

194
00:49:41.720 --> 00:50:05.449
Julie Cantor: regulators will often look at what is publicly facing on your website before your internal data map. So if there are things on your site that make it appear that you will not allow folks to exercise the rights you know them to have, that is potentially going to be worse than not having the internal infrastructure to back it up, though you do want to make sure that you have that infrastructure as well.

195
00:50:13.080 --> 00:50:42.250
Jasmine Sharma: All right, I totally agree. I think this past year has really seen high-profile enforcement actions across privacy. To name a few headlines, we've seen Texas reach a $1 billion settlement with Google, and in California we saw regulators fine Honda $600,000-plus to settle alleged CCPA violations. These cases are clear signals that enforcement is definitely ramping up and that organizations need to be paying close attention.

196
00:50:42.300 --> 00:50:57.920
Jasmine Sharma: Now that you've all laid out these challenges for us, I'm wondering, do those privacy challenges look different across industries? For instance, what should a privacy manager in retail be focusing on right now compared to someone working in SaaS?

197
00:51:00.720 --> 00:51:12.349
Monique Altman: Well, we're a SaaS company, so I'll jump in quickly on that one. It does differ significantly. A privacy manager in retail should probably prioritize issues like

198
00:51:12.500 --> 00:51:37.690
Monique Altman: video surveillance, loyalty programs, physical location data, managing in-store data collection, and consumer rights requests, given that they have such direct consumer interaction. A privacy manager at a SaaS company such as Ping is going to be focusing on third-party vendor risk management, due to data processing by numerous subprocessors,

199
00:51:37.800 --> 00:51:46.399
Monique Altman: shadow IT, secure access controls, and then the privacy implications of AI integration within each of these services,

200
00:51:46.846 --> 00:52:12.829
Monique Altman: as the primary data interaction is often through digital platforms and extensive integrations. So it is very dependent, I think, on the industry that you are in, and, as Julie so eloquently put it, if you're not sure, reach out and get some advice on that, especially if your company has a multiplicity of activity that doesn't fall neatly into one category or another.

201
00:52:17.870 --> 00:52:25.039
Julie Cantor: I think that's dead on, Monique. And to that I'll add that in the retail context, again, speaking just

202
00:52:25.330 --> 00:52:29.528
Julie Cantor: as my own firm, not on behalf of any former employer.

203
00:52:30.350 --> 00:52:37.080
Julie Cantor: retailers have the ability, and there are tools out there that allow retailers to track

204
00:52:37.190 --> 00:52:43.368
Julie Cantor: so much more of consumer behavior than you realize could possibly even be tracked.

205
00:52:43.990 --> 00:52:58.759
Julie Cantor: you know, if you go to an e-commerce site, for example, you might not realize that it's not just what you're buying that is being tracked. It's where your cursor is going on the screen that is being tracked. It's,

206
00:52:58.970 --> 00:53:11.210
Julie Cantor: in some places, not everywhere, what you are typing into search bars, even if you don't click send or hit enter on the search bar. So I think

207
00:53:11.480 --> 00:53:23.447
Julie Cantor: many consumers would be surprised if they understood the extent to which what they are doing can be monitored. And at some point there might be

208
00:53:24.030 --> 00:53:38.989
Julie Cantor: more of a headwind against that type of thing. But at present I think the technology is really outpacing people's understanding by a long shot. So it's just something to continuously be mindful of as you support

209
00:53:39.230 --> 00:53:40.350
Julie Cantor: your companies.

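To make the tracking Julie describes concrete, here is a minimal TypeScript sketch of how a page could capture cursor movement and unsubmitted search-bar keystrokes with standard browser APIs. The /telemetry endpoint, the #search element ID, and the payload shape are hypothetical placeholders; this illustrates the general technique, not any specific retailer's or vendor's implementation.

```typescript
// Illustrative sketch only. The "/telemetry" endpoint and the payload
// shape are hypothetical; only the browser APIs themselves are real.

type TelemetryEvent =
  | { kind: "cursor"; x: number; y: number; ts: number }
  | { kind: "search-draft"; text: string; ts: number };

const buffer: TelemetryEvent[] = [];

// Record cursor position as the visitor moves around the page.
document.addEventListener("mousemove", (e: MouseEvent) => {
  buffer.push({ kind: "cursor", x: e.clientX, y: e.clientY, ts: Date.now() });
});

// Capture the search box on every keystroke -- no click on "send"
// and no press of Enter is required for this to fire.
const search = document.querySelector<HTMLInputElement>("#search");
search?.addEventListener("input", () => {
  buffer.push({ kind: "search-draft", text: search.value, ts: Date.now() });
});

// Periodically flush whatever has accumulated to the server.
setInterval(() => {
  if (buffer.length === 0) return;
  navigator.sendBeacon("/telemetry", JSON.stringify(buffer.splice(0)));
}, 5000);
```

A few dozen lines like these are all it takes, which is part of why the disclosure obligations in the state laws discussed earlier matter so much.
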
210
00:53:42.600 --> 00:54:02.010
Jasmine Sharma: Yes, very well put, thank you. I think we have a question from the audience that I just wanted to pose before we leave off: what are the more critical aspects of the AI regulations that you think companies, particularly smaller companies, are ignoring or need to be made aware of?

211
00:54:12.740 --> 00:54:15.689
Dwight Turner: I guess I'll kick that one off. I mean,

212
00:54:16.620 --> 00:54:21.729
Dwight Turner: I think one thing we ignore is maybe just

213
00:54:21.990 --> 00:54:38.600
Dwight Turner: how much it takes to implement AI governance. I think that's a big thing. So getting started and getting all the input you need is important, but there are so many facets to it, even if

214
00:54:38.730 --> 00:55:00.370
Dwight Turner: your team is not coding the model. Maybe you're using someone else's model. You really have to understand your use case, you have to understand that particular vendor, you have to read their documentation and understand what might be some limitations of the model itself, and what might be some limitations of

215
00:55:00.440 --> 00:55:11.289
Dwight Turner: the way that you're actually using the model, and figure out how to communicate that to your customers. Do you have a mechanism to have

216
00:55:11.510 --> 00:55:14.060
Dwight Turner: humans in the loop to

217
00:55:14.770 --> 00:55:42.200
Dwight Turner: jump in when something goes wrong? Are you able to stop processing, or do some type of pause, when something goes wrong? Are you monitoring the inputs for malicious acts, any type of poisoning, etc., and the outputs for what might be sensitive data loss, or might be your

218
00:55:42.320 --> 00:55:45.680
Dwight Turner: AI agent saying something crazy?

219
00:55:45.790 --> 00:55:54.359
Dwight Turner: Do you have what they call an AIR system, an adverse impact reporting system, where people can come to

220
00:55:54.470 --> 00:56:20.640
Dwight Turner: your company and submit a report that says, hey, I had an experience with your AI agent that wasn't so great? And then you're taking that and putting it into some type of feedback loop, so you're discussing it in your company, you're creating tickets for your engineers, you're trying to remediate those things where you can. Or maybe you're discussing

221
00:56:20.770 --> 00:56:25.759
Dwight Turner: what ethical issues some of those interactions raise. But

222
00:56:26.020 --> 00:56:32.069
Dwight Turner: all of that is, I think, a heavy lift for small companies. So you have to

223
00:56:32.799 --> 00:56:36.589
Dwight Turner: start early and work on these problems often.

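As a rough illustration of two of the controls Dwight mentions, here is a minimal TypeScript sketch of an output screen for obvious sensitive-data leakage plus an adverse impact reporting (AIR) intake that feeds a ticket queue. The regex patterns, the report shape, and the createTicket helper are all hypothetical placeholders, not a reference to any real governance tool or standard API.

```typescript
// Illustrative sketch only. Patterns, types, and createTicket are
// hypothetical; a real deployment would use proper detection tooling.

const PII_PATTERNS: RegExp[] = [
  /\b\d{3}-\d{2}-\d{4}\b/,    // US SSN-like pattern
  /\b\d{13,16}\b/,            // long digit runs (possible card numbers)
  /[\w.+-]+@[\w-]+\.[\w.]+/,  // email addresses
];

// Returns the model output if it looks safe, or null to signal that a
// human in the loop should review before anything reaches the customer.
function screenOutput(modelOutput: string): string | null {
  return PII_PATTERNS.some((p) => p.test(modelOutput)) ? null : modelOutput;
}

interface AdverseImpactReport {
  reporter: string;     // who had the bad experience
  description: string;  // what the AI agent did
  receivedAt: Date;
}

const reportQueue: AdverseImpactReport[] = [];

// Hypothetical ticketing hook; in practice this would call your
// issue tracker's API so engineers can remediate.
function createTicket(summary: string): void {
  console.log(`[ticket] ${summary}`);
}

// Intake: record the report and push it into the feedback loop.
function fileAdverseImpactReport(reporter: string, description: string): void {
  reportQueue.push({ reporter, description, receivedAt: new Date() });
  createTicket(`AIR from ${reporter}: ${description.slice(0, 80)}`);
}
```

Even a stub like this forces the questions Dwight raises: who reviews flagged output, where reports go, and how they turn into remediation work.
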
224
00:56:39.140 --> 00:57:01.688
Julie Cantor: Totally agree with Dwight's point there, and I think it's a massive undertaking to truly understand what your company is doing in terms of AI, especially if it is small and you have a team of 5 to 10 people. It's a real heavy lift because the technology is big, even if your company is relatively small.

225
00:57:02.140 --> 00:57:08.125
Julie Cantor: And then to that I'll add, when you're an extra small company, often you will leverage

226
00:57:08.860 --> 00:57:21.809
Julie Cantor: third parties that may not be as mature. So don't assume that a tool you're engaging has it all together with respect to AI compliance.

227
00:57:24.850 --> 00:57:41.860
Jasmine Sharma: Awesome, thank you so much. Really great responses, and a big thank you to our panelists, Monique, Dwight, and Julie, for sharing all your invaluable expertise with us today. Now, before we close, let's take a quick look at some helpful resources that we want to share with you.

228
00:57:41.860 --> 00:58:00.549
Jasmine Sharma: First, if you're interested in continuing privacy discussions, want to stay up to date on the latest privacy news, or are thinking about expanding your network, I really do encourage you to join our Slack community, Privacy Basecamp. It's the go-to space for valuable privacy insights and connections,

229
00:58:00.550 --> 00:58:05.080
Jasmine Sharma: plus, all three of our speakers on this call today are members, so do join.

230
00:58:05.080 --> 00:58:28.759
Jasmine Sharma: And then next, do keep an eye on your inbox for our guide to state privacy laws. This guide will be an essential resource for navigating these state privacy laws and the new requirements going into effect in July. For our DataGrail customers, you can request changes to your policies from the defaults by emailing support at DataGrail before June 24th,

231
00:58:28.760 --> 00:58:36.120
Jasmine Sharma: and for everyone else on the call, be sure to check your email later for instructions on how to claim that CPE credit.

232
00:58:36.120 --> 00:59:00.690
Jasmine Sharma: Well, thank you all for joining us today. If you still have questions or input of your own, I really do encourage you to join us next week for a follow-up discussion hosted by Privacy Basecamp, our Slack community. We'll have Monique, Dwight, and Julie returning, over Zoom this time, for an open discussion: a non-recorded event where anyone can join, contribute, or just listen in,

233
00:59:00.690 --> 00:59:11.430
Jasmine Sharma: so I'll go ahead and send the sign-up link in this chat. We really do appreciate all your time and attention as we navigated through these privacy updates.

234
00:59:11.430 --> 00:59:32.240
Jasmine Sharma: I hope you all found today's session really valuable. Once again, a huge thank you to our amazing speakers, Monique, Dwight, and Julie, for sharing all their insights. There will be a quick survey at the end; if you could, please take a moment to fill it out. Thanks again, everyone. We really look forward to seeing you at our next event. Have a great day ahead. Thank you.
