Turn data security & compliance into a business advantage


In this episode, Bart and Sylvain are joined by Evelyn de Souza, Head of Privacy and AI Governance at Oracle SaaS Cloud. We discuss how companies can turn security, privacy, and compliance into brand differentiators, enabling companies to achieve a quantifiable ROI.

The views and ideas expressed during this podcast are solely the author’s and not those of any affiliated company.

Summary of the conversation:

  1. Strategic Approach to Compliance: Turn data security and compliance into a business advantage by strategically aligning with harmonized frameworks, reducing complexity for engineers, and avoiding overwhelming stakeholders.
  2. Incident-driven Improvement: Use privacy incidents as opportunities not just for recovery but to revamp policies, processes, and interfaces, transforming challenges into enhancements that strengthen brand reputation.
  3. Future-proofing Through Design: Design services with a focus on user experience, fairness, and explainability, positioning privacy as a key differentiator that contributes to a strong brand.
  4. Regulatory Collaboration: Collaborate with regulatory bodies proactively, providing feedback and participating in creating standards and frameworks to stay ahead and navigate the evolving regulatory landscape.
  5. AI Trust Building: Prioritize building trust in AI by addressing factors like explainability, fairness, and security, ensuring that AI applications align with privacy principles and contribute positively to the business landscape.
Read the transcript

0:00

Hello, welcome to the Data Defender Forum episode #5.

Today we are joined by Evelyn de Souza, who is the Head of Privacy, AI, Compliance and Ethics for Oracle SaaS Services.

One thing that's very interesting is that oftentimes compliance is seen as something that's more of a burden.

0:24

But Evelyn knows the magic. I've known Evelyn for a long time, but when I glanced at your LinkedIn, what I liked and what I read is that you want to turn data compliance into a business advantage.

0:40

And so today we are going to talk about this: how can you turn data security and compliance into something that's positive for your business?

And there is actually a lot to that.

Evelyn, welcome to the Data Defender Forum.

0:58

Thank you.

I’m excited to be here today and talking about a topic that I’m most passionate about.

The views and ideas expressed during this podcast are solely my own and not those of any affiliated company.

1:13

You know, just for the sake of context: nowadays if you're driving down the highway in major cities like San Francisco, you'll see advertisements, billboards for privacy companies that perhaps we wouldn't have seen two to three years ago.

What has changed over these last few years that's made consumers more aware, and is really driving the sense that privacy is becoming a critical factor?

1:39

You know, it’s so interesting.

Who would have thought?

Who would have thought that privacy would take centre stage on freeway advertisements two or three years ago?

You know, when I think back to the advent of the General Data Protection Regulation, GDPR, back in 2018, which was an evolution of the former European data protection directive, I never would have thought of the forcing function it would have worldwide, or the impact in two ways: not just in terms of regulation, but in terms of user sensibilities and how much those have changed.

2:23

So firstly, I’m going to talk about the regulation piece today.

Most of the world's privacy regulation (and there's been roughly a 200% increase in regulation just over the last five years):

2:39

It’s very GDPR centric or focused around key aspects or key tenants in there.

And along with that, there has been a huge drive to get consumers aware of their rights.

I’m going to use what happened in California as a key example.

3:00

You may have heard of Alastair Mactaggart.

He’s a real estate entrepreneur.

Amongst many things, he was the civilian who drove California’s latest privacy regulation.

What we know as the California Consumer Privacy Act, or now the California Privacy Rights Act.

3:18

Along with it, you know there was a huge awareness campaign.

What has fundamentally changed is that users are savvy today, and there's a huge mistrust of corporations.

That divide has increased.

3:35

Often consumers feel like organizations are not clear and upfront with what they’re doing with their data.

And that's consumers: people like you and me are increasingly exercising our rights when it comes to privacy.

3:52

With that in mind, like you said, as users, consumers become more aware, more demanding, and want to get more informed.

When we hear topics like privacy by design, what are those?

What are the things that we should keep in mind?

What’s your definition?

You know, so I’m going to give a very sort of layman’s definition of privacy by design, because I really do believe that today, you know, there is an opportunity to make privacy so much more user centric and focused on giving users that sense of trust.

4:27

So I think it's a balance: organisations being able to innovate with data and doing so in a trusted way.

So designing their services in a way where users, one, have more transparency around what's taking place with their data and how it's being used.

4:46

And then secondly, these rights: the right to edit, the right to be forgotten, the right to delete, all of these.

The more they're built into the design, the more users are aware of the policies that govern their data.

5:05

So it’s not like a 12 page policy that would be very difficult for users or somewhere embedded on page 14 or 15 of that policy is the actual crux of how the data is being used, you know, and and so privacy by design is really thinking about the users and driving privacy from the user perspective so they don’t have to, you know, troll through hidden menus to find those opportunities to invoke their rights.

5:36

And here’s the interesting thing.

When designed properly around end users, fewer tend to invoke the right to be forgotten or the right to edit, because they feel much more comfortable.

They feel like an organization is being upfront.

5:53

Yeah, data privacy: according to the UN, more than 70% of all the countries in the world have some sort of data privacy laws in place.

And as you said, consumers are increasingly sensitive and demand that these types of rights are taken care of, so companies have no way around them.

6:19

But why do you believe that some founders and C-level leaders are sceptical about the impact that privacy can have on the organization and, more specifically, on the profit line?

6:38

You know, it's such an interesting thing, the way that we quantify the benefits of different aspects of our business in cyber.

And I think one of the things in cybersecurity is you can say there were this many vulnerabilities that caused this impact.

When nothing happens in privacy, meaning that there hasn't been a major violation or a data breach, it can sometimes be hard for CEOs and C-suite leaders to see the impact that privacy is having on their business, because nothing's happening. You know, that's a good thing.

7:12

So why invest more over here?

And I think it's upon us as privacy practitioners, or those of us who champion privacy in organizations, to present the impact of privacy in a much more business-consumable, quantifiable way.

7:31

For example, we just talked in the previous part about end-user rights and those who are invoking them.

One way would be to compare a service that is not designed in a user-centric way versus another.

7:48

And if they're apples to apples, in a similar space, to be able to show: well, you had this many users dropping out, and that equated to a loss of this many percent of customers.

That's a huge thing.

To be able to report that back to a C-executive suite, or to show that as a result of redesigning the service you minimized that drop-off, would also be a very quantifiable and tangible business benefit.
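As a back-of-the-envelope sketch of that kind of quantification (all numbers are invented for illustration), the drop-off comparison and its revenue equivalent might be computed like this:

```python
def drop_off_rate(started: int, completed: int) -> float:
    """Share of users who abandoned the flow before completing sign-up."""
    return (started - completed) / started

# Hypothetical A/B numbers: a consent flow buried in legal text
# versus a redesigned, user-centric one.
before = drop_off_rate(10_000, 7_200)   # 28% abandoned
after = drop_off_rate(10_000, 9_100)    # 9% abandoned

avg_customer_value = 120  # assumed annual value per customer, in dollars
recovered_customers = int((before - after) * 10_000)
print(f"Drop-off: {before:.0%} -> {after:.0%}, "
      f"~${recovered_customers * avg_customer_value:,} revenue retained")
# Drop-off: 28% -> 9%, ~$228,000 revenue retained
```

The exact figures don't matter; the framing, "redesigning the consent flow retained roughly this much revenue", is what makes privacy consumable to a C-suite.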

8:23

Yeah, for sure.

It's about, as you said: nothing might happen, but just having the trust of your users or consumers might be hard to measure, yet it's absolutely necessary for them to buy or use your services, especially depending on what you are offering or the type of data you are asking of them.

8:44

So Evelyn, in what ways can putting these guardrails and data privacy elements in place enhance a company's trust and brand reputation?

9:01

You know, it's very interesting when you look at Pew Research and you see what constitutes trust for end users today.

And a lot of it is around the feeling that their trust has been eroded.

Big institutions they believed in, banks as an example, and others, some of them may not be as trustworthy as they once thought.

9:27

So in terms of how that translates into the opportunity privacy has for trusted brand impact: think about that sort of transparency.

It promotes the idea that, oh, this company holds true to some ethical practices around data, because today it is data that is becoming our new currency.

9:57

And it’s a form of value for an individual user.

It’s a form of their empowerment.

So when they look at a product and they see, one, the way that their data is going to be handled, plus exactly what they've agreed to, then as they work through the product they feel that those choices are easy.

10:18

You know, they're easily able to say: yes, these are the things that I'm agreeing to do with my data, and it's easy at any time to change that.

I think that plays into a trusted experience, and you see it when you have products that are otherwise equal: quite often the difference is the control that is given to end users in one versus the other.

10:42

It can be a big revenue differentiator.

That’s what many companies don’t realize today.

With that in mind, in terms of what you said about building that value, building that experience of trust: one of the things that I noticed in 2018, I was working for a British software development company focused on ad tech and marketing tech that had offices all over the world.

11:03

So when GDPR came out, there was a sort of finger-pointing: I know this is important, but I'm definitely not in charge.

There was the idea, still in place, of the Chief Data Protection Officer, but some tensions and doubts between engineering and legal.

11:23

Who was really going to be owning this?

Who is going to be driving it?

Can you walk us through, for organizations out there that want to roll this out better and be more effective when it comes to achieving privacy across an organization: who are the key stakeholders, and how can you facilitate that?

11:40

How can you create a culture of collaboration where people are working together, and not feeling like, I assume I'm doing something wrong, but it's not my responsibility?

Like I said, this sometimes falls into the easy trap of finger-pointing: well, I know it's important, but it's definitely not my responsibility.

You know, it’s such a critical area that you’ve touched on, which is that privacy is a shared responsibility.

12:03

And you know, quite often there is a divide between legal and technical teams.

And it's even true in terms of how many regulations are formed: quite often they may be developed by stakeholders who may not necessarily have a lot of technical or operational expertise.

12:25

And so when those regulations are handed down, legal teams might quite often be the first to really dissect them.

But what’s really interesting today is that many engineers are equally interested in privacy.

12:41

I have worked with many engineers who’ve waded through regulations and have tried to understand them and have tried to make sense of them.

And so what this really means, when you're trying to operationalize it in an organization, is that your stakeholders consist of technical, legal, policy, human resources and other functional leaders, because it touches so many different aspects.

13:05

So depending on the service.

And I'm going to use, just for simplicity's sake, a marketing service or ad tech, because you'd given that example.

So in such an example, it would really be multifunctional.

13:20

It crosses functions; you're going to need a multifunctional approach.

One: what is the regulation requiring? Two: how does it impact operations for that service? So you've got your product management; you might have marketing leaders.

13:39

Three: the developers who are doing the work, and how do they align that to their current engineering methodology?

And even user experience, your UX folk: how are you going to design that in? Really, you want to have that cross-functional group meet and collaborate up front.

14:02

It's no good designing things, then going through a review process and bringing in your user experience people at the end of it. It has to be from the get-go.

With that in mind, as you said, from the get-go: let's say an early pre-seed or seed-level startup is growing across that lifespan, and eventually things go really well; they get to IPO, post-IPO.

14:29

How can privacy be scaled throughout this process?

Some startups might feel that, oh, we're just not even going to worry about it until later on, but then they can have serious difficulties; at the same time, they don't want to obsess over it too much.

In your experience, what advice would you give to startups where they know privacy is going to be a factor?

14:50

Like you said, if we're talking about e-commerce, we could be talking about health records; we could be talking about a lot of things that are very sensitive data over which people do want oversight. They do want protection, they do want the right to be forgotten or to have editing access.

What advice would you give to startups that are going through this process, in order to scale their approach to data privacy accordingly?

15:16

You know, this one is a hard one, because quite often cash-strapped startups, especially in the early phase, might treat this as the last consideration. Quite often they may put it in the same bucket as other compliance, and today there's a multitude of those regulations. But here's what I'm going to say: take the example you've given, where it touches a number of sensitive areas; let's say it's some sort of fitness application.

15:48

If you don't design that right from the get-go, not only are you not going to win over the trust of users, but you're going to run afoul of so many regulations.

16:04

And today the appetite from regulators for handing down very large fines is such that it's simply not worth it, unless you've got an executive with an appetite for some of those eye-watering fines that we're seeing.

16:23

So in terms of how you scale it, you're really going to have to look at what it is you're doing, and make sure that you have the right people: not just legal and policy, but someone from a privacy engineering background who's able to help build that in from the get-go, so that you can replicate it across all your services.

16:49

That's interesting.

Just to take it one bit further.

So for startups, as they're expanding their teams, if I understand this correctly, it's almost something to approach from a human resources or talent perspective: as you're building your team out, include questions about candidates' experience working with privacy regulations, particularly in an increasingly multinational environment.

17:14

A classic example: startups from Europe, which deal with a certain set of regulations, are oftentimes interested in expanding to the United States, but have to be wary of the fact that as they're crossing the ocean, there's also a set of legal things that need to be taken into consideration.

17:34

For companies as they're looking to build out their teams, what would be good questions to ask, or even certifications that you think would be recommended, in order to make sure that that level of consciousness is going to be present in the startup?

So today, there are a number of different certifications.

17:54

I certainly would recommend them where people are starting out, because they give that excellent grounding.

But there is nothing that beats the practical experience of doing.

And so someone who's had that experience building it in from an engineering perspective is going to be extremely valuable, because today it's not just about a set of policies that you set.

18:20

You can't just go tick the box today.

It really is about actually building those capabilities into a service.

And the good news, in terms of some of the geographic differences: there are differences,

18:40

I won't say there aren't, but they're largely nuanced.

So for someone coming from one area of Europe to the United States, many of those skills and much of the knowledge and lessons learned would be applicable in the other environment.

I'm going to jump on that topic.

18:58

With today's digital business being basically boundaryless in terms of the geography its customers are in, how can a company approach the fact that they will serve customers in different geographies?

19:21

Does the same set of rules apply everywhere?

And if not, how can they tailor it to each one of them?

You know, I always find it so interesting how different compliance entities build up.

19:39

Either compliance frameworks or, in the case of privacy, different regulating bodies will build up regulation.

But often without thinking about the similarities or the differences between all of those different regulations, and how any corporation would be expected

20:00

to navigate that.

You know, it really does feel like we've reached what was referred to back in the early 2000s as compliance fatigue, maybe now as privacy fatigue.

Because, and I use this as an example, in the United States there are probably over 12 different state privacy regulations, maybe even more than that, and a very large number in pending bills working their way through the process.

20:30

And then we talked earlier about how privacy regulation worldwide has really doubled since 2018.

And so the first thing, when you're thinking about how you're going to apply this to an organization: you really want to look for those similarities and look for a harmonized framework, because you really don't want to be overwhelming engineers and your stakeholders with all the nuances between them.

20:57

Now, you do ultimately have nuances to observe. For example, if you had a marketing application, and today I'll just use the state privacy regulations: some are opt-in, some are opt-out, and you do have to observe that.

But if you start with a harmonized framework and then look to see where the nuances exist and where your customer base lies, then there's the opportunity to design for those: treat 70% as the harmonized, common part and then treat the other 30% as different, versus trying to navigate 200-plus different regulations and treat them separately.
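A minimal sketch of that 70/30 idea, assuming an invented baseline and invented per-jurisdiction overrides (the control names are hypothetical and this is not legal advice): define one harmonized set of controls, then layer only the jurisdiction-specific nuances, such as opt-in versus opt-out, on top.

```python
# Harmonized baseline applied everywhere (illustrative control names).
BASELINE = {
    "consent_model": "opt-out",
    "honor_deletion_requests": True,
    "honor_access_requests": True,
    "privacy_notice_required": True,
}

# Only the nuances that differ from the baseline (invented examples).
OVERRIDES = {
    "EU": {"consent_model": "opt-in"},
    "CA": {"sale_opt_out_link": True},
}

def requirements(jurisdiction: str) -> dict:
    """Baseline controls plus any jurisdiction-specific overrides."""
    return {**BASELINE, **OVERRIDES.get(jurisdiction, {})}

print(requirements("EU")["consent_model"])  # opt-in
print(requirements("TX")["consent_model"])  # opt-out
```

Engineers then implement one baseline once, and each new jurisdiction becomes a small override entry rather than a whole new compliance program.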

21:39

I hope that made sense.

Yeah.

It does. So building a foundation that would cover the entire spectrum of all this regulation would be the way to go.

21:58

And today I feel like these regulations are becoming more and more fragmented.

Do you think that's the case?

Do you think at some point they will be unified? For instance, in the US, do you believe that states will increasingly do their own regulation, or do you think the federal government will step in and come up with one set of rules? How do you see this law evolving?

22:35

I don’t know.

It seems so difficult to tell, because we are hearing rumors of a federal regulation and increasingly seeing signs of it.

And it would be an ideal way to go, because it would simplify so much.

Even though so much of the world is turning out regulation, I do see many, many more commonalities than differences.

22:58

And the good news is that we have industry associations like the International Association of Privacy Professionals, the same entity that also offers those certifications, that produce a lot of these mappings, which really enable privacy professionals, even privacy engineering folk, to better navigate those.

23:20

So I do think that the resourcing from these associations is helping enormously in terms of addressing things more holistically versus as one-offs.

And the other thing I do want to add, sorry, is that there's also a myriad of privacy tooling coming to be; we didn't have that three or four years ago.

23:45

And some of those also have inbuilt frameworks for being able to comply holistically but then report out separately as required.

And they also have excellent capabilities in terms of how you map data flows.
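The data-flow mapping such tooling provides can be pictured as a record per flow. This toy version (field names are invented, not from any real product) is enough to answer "where does this category of personal data go?" holistically and then filter per report:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataFlow:
    source: str
    destination: str
    categories: tuple   # kinds of personal data moving along this flow
    purpose: str

FLOWS = [
    DataFlow("signup-form", "crm", ("email", "name"), "account creation"),
    DataFlow("crm", "email-provider", ("email",), "marketing"),
    DataFlow("checkout", "payment-processor", ("card",), "billing"),
]

def flows_involving(category: str):
    """Filter the map, e.g. for a report on one data category."""
    return [f for f in FLOWS if category in f.categories]

for f in flows_involving("email"):
    print(f"{f.source} -> {f.destination} ({f.purpose})")
```

Once flows are records rather than tribal knowledge, holistic compliance and per-regulation reporting are just different queries over the same map.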

24:02

So there are a lot of capabilities that businesses need, and this especially addresses that earlier question of how people scale in a manageable way: through using a lot of this tooling and these resources.

Once you have your people and your processes in place, that can really help things.

24:20

Yeah, absolutely.

That's why Bart and I created this podcast: ultimately it's the crossroads of engineering and legal, and engineering competence is absolutely necessary to be able to scale and to avoid this privacy fatigue that we mentioned earlier in this episode.

24:44

And so without using tools that will scale this for you, you could end up spending a lot of resources and not doing it right.

So yeah, that sounds like a solid starting point.

24:59

Look for tools that can do this for you.

Absolutely.

So now we've talked about what companies should do and why they should do it, and I think it was a very positive way of framing things.

25:14

So now let’s talk about the other side of the coin.

Let's say that there's been an incident.

How can companies recover from this?

How can they come back with a better privacy posture?

25:35

Yeah.

And it's sort of a weird thing.

It's very hard for companies to believe that they can not only recover, but do so in a way that really enhances their brand and reputation.

25:54

So what I've seen sometimes is that when some of these incidents happened, it really forced companies to examine every aspect of how they are handling data, including what they commit to customers and even their outward messaging; and, from that incident, what did they do right?

26:17

The post-analysis of that is a critical thing in terms of how you recover and come back stronger. What lessons did you learn from how your customers reacted and the pain that they felt? How did you empathize with that, and what are you going to do that's substantially different?

26:39

So taking those lessons, I've seen companies really revamp their policies, their processes and the interfaces within their services to become really trusted guardians of data.

26:58

Very similar to the theme, or the name, of your podcast: Data Defenders. Truly to embrace that in every aspect of their business.

A security friend once told me there are two kinds of companies: the ones that know they've been hacked and the ones that don't.

And the same goes when it comes to privacy mistakes or an incident.

27:16

It's not to say that it's that black and white in terms of the dichotomy, but the larger an organization gets, the more probable it is that there could be some kind of a slip-up.

And like you said, it's also not 2018; there are lots of tools available now that weren't available then.

27:33

And so I think organizations are much better equipped to address this internally, as well as externally to their stakeholders.

If some kind of data breach or leak happens, as we've seen time and time again, I'd want to say almost every week; if you look around for one, it's probably not too hard to find.

27:52

We mentioned previously how privacy is now being dealt with by well-known governments at a very high level.

Another thing that we’re hearing about more and more and it just doesn’t seem to stop, is artificial intelligence.

But of course, it really means something different to every person depending on the context.

28:12

If we're talking about AI and privacy, how do you define AI?

What do you see as the connection between AI and privacy, and how can trust be a differentiator?

You know, it's so interesting, because I feel like artificial intelligence, that ability to apply some sort of algorithm to derive outcomes based on a set of data that may or may not affect an individual, has been in place for some time.

28:43

But I think with the advent of large language models, you know, we’re now moving to the second phase where we’re seeing all sorts of capabilities.

And there’s a lot of hype.

And there are also a lot of real use cases that have huge implications for our lives, for our jobs, and for the way that data about people and individuals may be used to arrive at outcomes that could have serious consequences for them.

29:22

I feel like we're at a great turning point; I think it's one of those turning points in history and in technology.

And from a a trust and a privacy perspective, I see two things.

The first one is things like explainability.

29:40

Fairness, bias, security, accuracy: they all sound like buzzwords.

But as key tenets of policy, these are becoming increasingly critical.

29:57

Companies can help build trust by thinking about whether those models are explainable, whether they're fair. You know, are they safe?

Are they accurate?

Can humans be involved?

Can they override the process if they choose not to be part of it?

30:18

Those sorts of considerations are becoming increasingly critical.

With that in mind, on both sides: it often seems that regulatory agencies are behind when it comes to how quickly technologies are moving.

30:35

How can regulatory agencies work with companies to stay up to speed?

And with that in mind as well, how can companies future-proof themselves when it comes to protecting privacy, in order not to risk their consumers' or users' data, or anything that might not be illegal, of course, but that people are not willing to simply hand over?

30:58

You know, it's an interesting thing.

I'm going to answer the first part of the question, which is around regulatory bodies moving to a more proactive posture.

So one of the things that I've seen, which I think works really well, is where regulatory bodies put out a standard or a framework and invite feedback.

31:18

And quite often what I'm seeing is a collaborative dialogue starting to emerge, where there's an opportunity not just for businesses but even individuals to have input into different aspects of regulations or frameworks.

31:37

And in terms of how that translates into how an organization addresses this myriad coming at them like from a fire hose:

I think part of it is that today, increasingly, for example NIST over here or the European Data Protection Board, they're hosting forums where a lot of this is coming together in a much more holistic way than it has before.

32:03

And I think that becomes an opportunity.

They're increasingly putting out resources aimed at helping organizations come up with a holistic framework that addresses privacy, risk management, parts of AI, and compliance in a much more well-rounded way than previously.

32:25

And this, I see, is a very welcome development.

Good.

No, I think, like you said, making it as collaborative as possible, so that even at the individual level someone can voice their concerns. Some of these things are being done, I don't want to say it's the Wild West, but building on what we've been looking at: five years of GDPR sounds like, feels like, a really, really long time, and there was a pandemic in the middle.

32:55

But I think that moving forward, people being aware of the fact that they can be part of the conversation is very empowering. And just knowing, like you said, that the tooling has built out a lot, and there are more conversations happening.

It's easier and easier, I would say, to get closer to those conversations.

33:13

As well as the fact that, like you said, from a user experience perspective, consumers are being more demanding; we could say they're upskilling when it comes to their privacy vocabulary.

And companies, I think, are also being guided into making it easier for folks to understand these things.

So it's not just written in legal jargon or fine print that takes too long to go through and will simply turn folks away.

33:35

So I think that moving forward it's a more positive landscape, a more transparent landscape, where it's easier for consumers as well as regulatory bodies to be informed and to make better decisions.

And I think that's a nice note to include in the conversation, because oftentimes with privacy there can be doom-and-gloom scenarios of where are we headed.

33:58

But what I’m getting from this, and I’m curious for your thoughts on it, is that the current situation is much better prepared than we may have felt five years ago, when it was sort of like, well, we’ll just wait until the large fines start coming from Brussels and then we’ll start doing something about it.

34:15

And I think the lesson has been, you know, don’t wait to become part of the conversation.

Do so as soon as possible, and, like I said, enjoy the fact that we have the availability of so many resources now that perhaps we didn’t have some years ago.

And maybe, you know, I’m being overly optimistic here, but I honestly do believe that quite often it is the bad news that makes the news.

34:40

How often do we hear the good news stories of where companies are doing things right?

And I’m not saying we should overlook it, I mean, we increasingly hear every day about these violations.

I’m not ignoring that, and I do think that they’re very serious and that there is a whole side of, you know, an industry that does need to be overhauled.

35:03

But I am increasingly aware of so many proponents in our industries who are focused on good and who are focused on their end users.

And I just felt like this forum was a perfect opportunity to highlight that.

Yeah, what you mentioned about the way to involve all the actors in the conversation can be aligned with, or compared to, some concepts in technology, such as the DevOps movement, right?

35:35

Which was meant to break the barrier between developers and operations.

I don’t know if we need to come up with a LegalDev movement here.

Or a DevPrivOps or, you know, just like we have DevSecOps.

35:54

It’s DevSecPrivOps now, you know, because I really see security and privacy as two sides of the same coin.

Yeah, so maybe we need to coin the term here.

Yeah, I think it’s exactly what you said,

36:14

Sylvain, this idea of breaking down silos.

It doesn’t mean that every developer, every engineer, needs to become an expert on this.

But nowadays, with so many industries being affected, to simply act like it’s out of sight, out of mind:

36:30

I’m not a lawyer.

It’s not my job.

I don’t think it’s fair.

And with the technologies that are being used, like I said, some areas are more sensitive than others when we’re talking about people’s zip codes and addresses and credit card information or healthcare records.

But there are plenty of other cases as well where, if folks are aware of what’s available to them from a privacy perspective, they’re able to make informed decisions about how their data is being used and know what’s going on behind the scenes at the companies to whom they’re giving their trust.

37:01

So I think that despite some of the challenges, I very much agree with what you said, Evelyn, that the success stories aren’t the ones making headlines, you know, if it bleeds, it leads.

So if we see a $20 million or $100 million fine to a major cloud provider or to some kind of financial services organization, that really catches our attention.

37:22

But it’s quite rare that a newspaper will publish that a company gets 100 out of 100 on their privacy score, or that consumers are very satisfied, perhaps in consumer rankings, and seeing trust in brands based on how they’re approaching this, in good times and in bad, is something we can be mindful of.

37:42

And I think, Sylvain, we can also try to have a bit of a spin-off of including positive news about how companies are doing this correctly in the vast majority of cases.

So getting those unsung heroes some credit for the work.

So let’s move to the last question, Evelyn. Going forward, where do you see all of this going, and what would be the last thing you would recommend organizations think about or do to really future-proof privacy as one of their core business differentiators?

38:27

You know, it’s a very hard thing to predict in these times, because I feel like some of my predictions have not come to fruition, while others have been turned on their heads.

But just based on where I stand now, I feel that we are definitely headed toward data as a currency.

38:51

You know, increasingly it’s becoming clear that it has a value.

And let’s say some of these predictions play out: that we may potentially work less in the future and may look for alternative streams of revenue.

39:11

Then that might be one avenue that could help: a way that individuals can earn from the use of their data and from key aspects about themselves and their livelihoods, but in an informed way.

39:32

And in terms of future proofing, I really think it’s about the way that services are designed.

I think my recommendation would be that organizations focus on building that in, on designing their services with end users in mind, with fairness, with explainability, with promoting a trusted experience.

40:03

So users feel that they’re in greater control. That’s the best way companies can begin to future-proof their business.

I think it’s a very nice note to finish on, talking about how things have developed since 2018.

40:19

If you had told me that a significant portion of privacy would involve user experience, I would have thought, what’s the idea with that?

This is a security thing.

This is our IT ops team; our big data teams are going to be the ones handling this.

40:36

And there’s really a lot to be said for that, that we can’t overlook the details when it comes to this.

And whether it’s organizations hiring engineers, making sure, whether through certifications or, as you said, through folks who have built technologies and platforms where privacy is being respected.

40:55

And they will point to that as a way of saying not only are they responsible and ethical as a person, but they have the technical know-how to make sure that the software or, like I said, the platforms are being built according to these standards.

That it’s not the scenario we were experiencing five years ago.

41:13

I remember even in 2018 some startups trying to build tooling, but it was too early, and so there wasn’t a lot of confidence in investing in those kinds of tools then, whereas now that’s seen as a much greater asset.

The other thing that you brought up that I really find relevant is not seeing this as a cost, but rather as an opportunity, given the fact that data is the new currency.

41:37

And also, we all know that costs are saved through not getting fines and not losing trust, through brand recognition around how data and privacy are being respected.

There is an unfathomable amount of money, as well as credibility, that can be gained as an opportunity there.

41:56

So, Evelyn, I really thank you for sharing your time with us today.

I really thank you for inspiring us to patent the term Privacy Officer.

We’re going to go about that, and we will be sure to include royalties every time that word is used. No matter how this goes, I really thank you for joining us.

42:15

If people want to follow the stuff that you’re doing, what’s the best way to do that?

So good question.

I maintain a strong presence on LinkedIn and also on Twitter, now X, and would love to continue the conversation going forward.

42:34

Perfect.

Thank you so much for your time today.

Thank you, Evelyn.