Labour party plans to force AI developers to share test data
Existing voluntary agreement would be made legally binding
It took too long for lawmakers and regulators around the world to identify the risks that social media posed. Labour wants to avoid repeating the mistake.
The Labour party has gone public with proposals to compel AI companies to share the results of safety tests of their technology with government officials.
A voluntary agreement between tech companies, including Meta, Google DeepMind and OpenAI, and multiple governments around the world was struck last year at Bletchley Park, unveiled with some fanfare by the Prime Minister. The tech companies agreed to "co-operate" on testing advanced AI models before and after their deployment.
Labour's proposals go significantly further. A Labour government would give the existing agreement statutory footing, forcing AI companies to share test data. AI firms would be legally obliged to share with government officials whether they were planning to develop AI systems over a certain level of capability, and they would need to conduct safety tests with "independent oversight."
Peter Kyle, Shadow Secretary of State for Science, Innovation and Technology, is in the US this week and immersed in discussions with both the US government and executives from the tech giants.
Speaking on Sunday to the BBC's Laura Kuenssberg, Kyle indicated that Labour's proposals should be seen in the context of previous failures by legislators, regulators and social media companies to recognise and act on the risks that social media posed, to young people in particular.
At the end of a week in which Mark Zuckerberg was accused by US Senator Lindsey Graham of having "blood on his hands", Kyle said that we need to start "getting ahead of the curve" when it comes to the potential for chatbots, deepfakes and other AI-generated content to harm vulnerable people, particularly the young. To that end, Labour wants legislation and international support in place sooner rather than later.
"We will move from a voluntary code to a statutory code," said Kyle, "so that those companies engaging in that kind of research and development have to release all of the test data and tell us what they are testing for, so we can see exactly what is happening and where this technology is taking us."
Kyle said that the results of the tests would help the newly established UK AI Safety Institute "reassure the public that independently, we are scrutinising what is happening in some of the real cutting-edge parts of … artificial intelligence".
Commenting on the proposals, Victor Botev, CTO of Oslo-based scientific AI start-up iris.AI, emphasised the importance of transparency.
"The ironic thing about OpenAI is it is anything but open," he said. "We don't know what data GPT-4 is using for testing and training, which may have partly motivated the Labour Party's move.
"As AI becomes more ubiquitous in our professional and daily lives, transparency remains crucial for the technology to advance safely. AI that properly cites and links back to sources wherever possible will allow users to verify facts and give due credit to the original data source.
"Models that traceably ground their outputs in source materials will do far more to build public understanding and trust than opaque 'black box' approaches."
Tim Wright, AI expert and technology partner at City law firm Fladgate, offers a rather more cautious perspective. He said:
"Whereas Rishi Sunak's government has said that it is in no rush to regulate and wants to avoid alarmist soundbites, Labour wants to force AI firms to report before they train models over a certain capability threshold and to carry out safety tests strengthened by independent oversight.
"We can expect to see more political machinations of this nature, as the election approaches, as each party seeks to gain the higher ground over issues such as transparency and accountability around potentially biased or unfair AI systems, to ensure that they are made safe before being unleashed on an unsuspecting public. Labour's plans seem laudable at first blush but a lot more work is needed to ensure that they don't discourage investment and innovation in the UK."