Profusion and Pinsent Masons LLP launch 'The Good Data Guide'

Because we can't trust AI creators to mark their own homework

Data ethics can be daunting

‘The Good Data Guide’ provides a data ethics framework and comes as research reveals 83% of business leaders feel morally uncomfortable about how their company uses data.

Last Friday saw the launch of data consultancy Profusion's 'The Good Data Guide' in partnership with Pinsent Masons LLP. The Guide, built in consultation with representatives from London Stock Exchange Group, Oscar O'Connor & Co and others, provides a comprehensive and practical framework for how organisations of any size can use data ethically.

Speaking at the London launch, Profusion CEO Natalie Cramp said:

"We cannot trust the people who have put large language models into the world to mark their own homework. This means that data ethics has to be the responsibility of every single person in this room and every single organisation. Whether we like it or not, AI is being used in our organisations, every single day."

Cramp was part of an expert panel chaired by Jeremy Kahn of Fortune Magazine, which also included Sam Nutt, Researcher & Data Ethicist at the London Office of Technology & Innovation; Dr Sue Chadwick, Strategic and Digital Planning Advisor at Pinsent Masons LLP; and Doug Brown, VP Data & AI for NCR Corporation.

The launch of 'The Good Data Guide' was accompanied by the publication of results from a survey of 200 business leaders at medium to large companies. Some of these findings are troubling. 83% of leaders said that they felt 'morally uncomfortable' with how data is being used in their businesses, and one in three businesses believe that their customers would not be comfortable if they knew how their personal data was being used. Friday's panel began by considering just why data ethics seems to be such a difficult area for businesses.

Sam Nutt commented from the perspective of local government: "It's really important to have that cultural conversation and a space needs to be created for that. But that requires some skills to be in place at different levels of the organisation and in local government they don't exist in a lot of places. So I think that's a big hurdle."

Dr Sue Chadwick commented on the similarities between individual and corporate journeys, and noted that "rewiring for data ethics" presents such a complex task that a big source of delay is simply deciding where to start. In Chadwick's view, starting small is the way to go.

"Do something small and regularly. Probably the first thing would be to have a policy. There are so many policies out there, so adopt some principles and then the next day pick one principle and make one change based on it. Then carry on."

Who is responsible for data ethics?

One particularly interesting area of discussion was accountability. Are companies choosing to make data ethics a board-level governance issue, or is it delegated to legal and compliance teams?

Natalie Cramp said that in her experience the tendency was for the latter, with the exception of fast-growth tech start-ups, which tend to be more data-savvy.

Panel at the launch of 'The Good Data Guide'. Image by Profusion

"Fundamentally," said Sam Nutt, "you're accountable as any organisation for the things you do, and that includes the outputs of an algorithm or AI that you procured from someone else. Even if it's a black box, you're still accountable for the things that come out of that algorithm, the decisions you make, the impacts that come from that. I think that's a really important foundation to start from.

"As a local authority we have a specific responsibility to residents so one thing we really encourage is much more participatory data practices which is developing your products, developing processes of governance as openly as possible in that process and being transparent with residents. If you can get that info as you go, then you're far more likely to design something ethically by default. And I think that also carries over to business as well. You do user research on products, but can you make questions about ethics part of that user research?"

Impact of ChatGPT

To what degree have the large language models which became so widely available at the end of last year complicated the data ethics discussion?

Doug Brown considered some of the impacts of what he described as the democratisation of complex technology.

"In places like New Zealand, Switzerland and in Estonia, citizens have been actually taking more control over their data and getting a value exchange back for that. A second impact is trust in the responses you get from these models. If you can't trust the output of it, then the ability to make better decisions is going to be much more problematic. In businesses, you're probably losing a lot of IP."

Should everyone in an organisation have the power to build their own data models?

"Every technological advance or evolution comes with its moral and ethical issues," said Brown, "but we have to trust people to use it in the right way. In a data driven culture, you trust people as part of how you manage them. I think that's a fundamental building block of how we actually manage this. It's not about the technology. It's about the intent and the use, and the behaviour of the people who are using these tools."

Natalie Cramp emphasised the parallel between decisions of consumers on ethical questions such as the clothes we buy and how often we buy them, and the decisions of employers on data ethics.

"Data ethics will look different in every organisation," she said. "There's a set of principles that are sort of generic that everybody should follow, and then it depends on what it means for you. Charities and local government for example have to be a lot more careful than a tech giant which can absorb the occasional huge fine."

The limits of regulation

Whilst acknowledging that regulation would invariably lag behind the pace of technology development, the panel was broadly welcoming of the forthcoming EU AI Act, and Dr Chadwick was also positive about aspects of the present UK government approach.

"I like that we've got this Algorithmic Transparency Standard, where you can see what other people are doing and share that knowledge. I like the idea of assurance. I would like to see some firm principles in law but also that pro innovation, quite creative and open approach that the Government is going forward with."

Ultimately, no organisation (or individual) should rely on a legislative or regulatory answer to the data ethics challenges that face us. When it comes to data ethics, each company has to decide where to draw the line, and the regulations that are put in place need to be flexible enough to adapt as the technology develops. As Natalie Cramp said:

"Whatever you put in regulation, it's just a good start. I think the danger is if people think that that's all that they have to do. GDPR wasn't perfect but it's all right and it gives us a baseline. But whilst we need to be compliant with that, that shouldn't make us feel like our job is done.

"That's what ‘The Good Data Guide' is. It's not meant to be a tick list to work through. It's to give you questions to ask at each stage and each term of this so that as you move through your journey, and as new technology enters, it's still relevant. I think that's the challenge with some of the regulation. It's hard for it to keep up. If you ask the right questions, it can help you to get to the right culture as opposed to that compliance and regulation."

The subjects covered in 'The Good Data Guide' include data management and security, the lifecycle of data, creating ethical algorithms and automations, algorithmic transparency and building diverse data teams. It was written to reflect that technology development will always outpace law.