UK Data Use and Access Bill reduces safeguards, say peers

AI measures conspicuous by their absence


The Data Use and Access Bill (DUA), which had its second reading in the House of Lords yesterday, is intended to “harness the power of data to drive economic growth, support modern digital government and improve people's lives”.

The DUA Bill is forecast to generate £10 billion over 10 years, according to the government.

It is a replacement for the Data Protection and Digital Information Bill (DPDI), which was introduced in the previous Parliament.

Among other measures, the lengthy Bill would rationalise data formats used in the NHS, speed up police data processing, improve coordination of roadworks for laying pipes and cables, alter the powers given to the regulator the ICO, digitise registers of births and deaths, and introduce a centralised digital ID verification framework.

Perhaps most contentiously, it changes the GDPR rules around automated decision making. Since the DPDI, these rules have been revisited by the Labour government, with “previous measures watering down the accountability framework, along with other measures that risked protections” removed, said Baroness Jones, presenting the Bill.

Nevertheless, plenty of criticisms remain.

Automated decision making

On the issue of individuals being able to challenge decisions that may have been made with no human in the loop, such as in automated hiring and firing, policing and immigration control: “How can somebody effectively assert their right if they do not even know that AI and automated decision-making were in the mix at the time?” questioned Conservative peer Lord Holmes, one of many who raised the lack of attention paid to AI in the DUA Bill.

“Currently, automated decision-making is broadly prohibited, with specific exceptions. This Bill would permit it in a wider set of circumstances, with fewer safeguards,” said crossbencher Lord Vaux.

What about AI?

The government appears to be playing a waiting game, with the wording of its own much-anticipated AI Bill likely to be affected by what the incoming US administration decides, which crossbench peer Baroness Kidron said was “concerning on many levels”.

Labour’s Lord Stevenson added: “We will need to address the governance of AI technologies in the very near future. It does not seem wise to delay, even if the detailed approach has yet to be worked through and consulted upon.”

Several peers, including Kidron, also said the Bill fails to tackle present-day or anticipated uses of data by AI, including the unlicensed use of creative works for training models and the impact of GenAI on children in terms of unproven edtech and the risk of sexual abuse and exploitation.

Privileged access for UK researchers and innovators

Welcoming the improved access to data for researchers, Kidron nevertheless said more should be done to ensure benefits accrue to the UK.

“We have a unique opportunity … with unique publicly held datasets, such as the NHS’s,” she said.

“However, we are already giving away access to national data assets, primarily to a handful of US-based tech companies that will make billions selling the products and services built upon them. That creates the spectre of having to buy back drugs and medical innovations that simply would not have been possible without the incalculably valuable data.”

She called for UK innovators and researchers to be given privileged access to such datasets, so that the benefits are kept in-country.

Burden on business and questions over data transfers

The burden on small businesses that do not hold much sensitive data troubled Conservative peer Lord Markham. He asserted: “We have concerns that the Bill will disproportionately add to the weight of the requirements on those businesses, without greatly advancing the cause of personal privacy.”

Meanwhile, fellow Conservative Lord Bethell questioned the lack of provisions to prevent transfers of personal data to jurisdictions without adequacy agreements, including data gathered by Chinese EVs and genomics firms. Crossbencher Viscount Colville worried that the “legitimate interests” of data-processing businesses might be favoured over the rights of individuals in the breakneck pursuit of economic growth.

Also on data transfers, crossbencher Lord Thomas warned that Britain could find itself caught between the unpredictable demands of the US and strict EU adequacy measures. “It is a very important aspect of this legislation that we look at how, in the transnational market in data, which is of immense value and importance to us, we protect the British public.”

The Bill will now be considered by a Grand Committee.

Commenting on the deliberations, Jim Killock, CEO of Open Rights Group, told Computing: "The Bill undermines protections against AI by rolling back the right to human review of AI decisions.

"We should be making AI more accountable, not less.

"The Bill misses an opportunity to make the ICO directly accountable to Parliament for their performance and to law courts for their decisions. This needs to be added, because data protection law depends on actual enforcement, which is currently severely lacking."