The AI-shaped headache for privacy professionals

Data protection officers from Canon, WHSmith and Prudential explain the new challenges that GenAI brings

Image: Do you know everywhere AI is being used in your business?

With the technology and the rules governing it in a state of considerable flux, businesses need to ensure their own houses are in order now, rather than trying to second-guess what might be coming around the corner.

The emergence of GenAI has shaken the governance world, bringing - along with many potential benefits - new risks to data security and privacy. Meanwhile, a global regulatory standard seems as far away as ever. In the US, Trump has lost no time in tearing up his predecessor's executive order addressing AI risk, and in the UK, while teasers are starting to emerge, the exact shape of the AI strategy is still unknown. Europe, of course, has established a risk-based approach, as has China, and around the world countries look to balance safety with giving entrepreneurs a free rein.

‘You've really got to understand where AI is within the company’

Whatever might be happening in places with few or no rules, multinational organisations must take their cue from jurisdictions where regulations apply, said Steve Wilkinson, privacy and AI consultant at insurer Prudential, speaking on a webinar organised by OneTrust.

The sensible approach is to adhere to "gold standards" on data protection, such as the GDPR, across the board, while keeping a weather eye on what's happening around the world. The legislation is "not going to be black and white", he said. The landscape is fluid and the onus is on individual companies to navigate through it.

"It's down to your company to sit down and think, OK what rules do we put into place?"

Those rules will depend on jurisdiction and the nature of the organisation, but more fundamentally on what the systems deployed are actually doing, which for many organisations is a huge grey area.

"You've really got to understand where AI is within the company," Wilkinson went on. "There's a huge amount of AI governance that you've really got to sit down and understand."

AI systems procured by departments may be leaking data or introducing bias. Mission creep may lead them to be deployed where they shouldn’t.

"Have we bought an AI system off the shelf? Where is that data? Do we have the right levels of consent? Are you going to use it to profile people? Are you going to use it to watch people in shopping arcades to gauge their facial expressions? How long are you retaining that data? How was that data used to train your models? All this has to be fully understood and documented."

‘It is really difficult to keep track’

Models trained (potentially illegally) on opaque data sources, GenAI's inherent unpredictability, the problem of shadow AI usage by employees (who knows where that data fed into ChatGPT is ending up?) and a plethora of overlapping regulations together make the data protection officer's job harder.

Image: Lizette van Niekerk, group DPO, WHSmith

Lizette van Niekerk, group data protection officer (DPO) at retailer WHSmith, spoke about the difficulty in staying ahead of the curve.

"It's really challenging to understand whether you are subject to [a regulation], or just parts of your business, or any of your brands, and to understand how they overlap and what the requirements are," she said. "It is really difficult to keep track and to make sure that the accountability framework we are building keeps up."

Education and AI literacy are critically important, she added. Last year, WHSmith launched an AI policy, making it compulsory reading for every single person in the company. It has also made its way into refresher training courses on security.

Currently, van Niekerk's team is looking at specialist training for higher-risk departments and use cases, such as internal software development, HR and people teams, and onboarding and recruitment. "If they are using an interesting tool that a helpful vendor has told them will make recruitment so much easier, we have to understand it. Is it fair? Does it meet our requirements for using that technology?"

‘You need strategic influence, effective communication and practical tools’

Done the wrong way, this could be perceived as nosiness, casting data protection and privacy professionals as unwelcome enforcers. DPOs need a strategy.

Make sure you have board-level support, urged Fred Oberholzer, EMEA privacy director and group DPO at Canon.

Image: Fred Oberholzer, EMEA privacy director and group DPO, Canon

"Educating a company about privacy involves a combination of strategic influence, effective communication and practical tools, like monitoring controls. AI is no different in this regard.

"In the first instance, you need to ensure that executive buy-in is achieved. Make it a business priority. Explain how strong privacy practices and AI governance align with company values, build trust and create a competitive advantage. If you get executive leadership buy-in, then you are on the road to implementing effective governance procedures.

"Use real-world examples as well, highlighting high-profile data breaches or penalties under the GDPR or other relevant legislation to emphasise the financial and reputational risks of non-compliance."

As well as ensuring they have a seat in the boardroom, DPOs need to operate an open-door policy, said van Niekerk.

"If they come to me when they start thinking of a new vendor or a new project, and say, Lizette, I just want to talk to you about this. Is it a No? Is it a Yes? Is it a Possible? That shows that you're starting to be completely embedded in the organisation, and they see you as a helper, not a blocker."

And when talking to the board, demonstrate how you can save the company money, suggested Wilkinson, mentioning one company that had saved £3 million through a timely data protection intervention.

"As a privacy professional make sure you're getting into the project as soon as possible, go in there, have a look, and see where they can actually save money in relation to the project."