‘A really heavy appetite’: How and why the Crown Prosecution Service is embracing AI
Crime is changing; law enforcement must do the same
2020 saw the world stuck at home, forcing companies to change how they approached work. It was a time of outstanding innovation… or was it?
“People talk a lot about the pandemic being a real time for innovation,” says Gemma Hyde, deputy director of digital at the Crown Prosecution Service. “I don’t think it was: it was a real time for automation.”

CPS, like other organisations, “had to stop some of the true innovation we were doing [in 2020] to just focus on getting the automation stuff out there, because that's what people really needed.”
Five years on, the dial has ticked back the other way. Now automation isn’t blocking innovation at CPS, but enabling it.
Actually, that’s a bit of poetic licence: Gemma named automation as an efficiency driver, such as in the new Casework App. AI, meanwhile, is the innovation catalyst.
Last year CPS worked with NTT Data to test the concept of a new system called Case Explorer, built using generative AI. CPS lawyers could use Case Explorer to pick up a new or existing case, view key information at a glance and interrogate it to drill down further into the details – similar to experiments by private law firms like Linklaters.
“All the lawyers we went out to, every single one, had heard of gen-AI and ChatGPT, and about 50% of them had actually used it themselves.”
Considering some of those lawyers were fairly technophobic, that speaks to the ease of use and penetration of AI platforms. Gemma says there’s “a really heavy appetite” to explore the technology, and with good reason.
“Case load now is more than double for each individual lawyer what it was pre-pandemic. What can we do to reduce that burden? It all helps.”
Most of CPS’ focus today is on low-risk applications, like providing policy guidance for lawyers, but “riskier” uses are also being explored – if only to find the limits.
“They might not be right for us, but by looking at those, we'll work out where our red lines are as an organisation.”
While Gemma speaks passionately about CPS’ use of AI, she admits it’s not a silver bullet.
“Just because we can doesn't mean we should. That's my personal and corporate view on it.”
She continues, “We do a risk and harms workshop for all of the use cases we come up with, before we even take them into a discovery [phase], thinking about what could go wrong [with the technology]; but also, things like what could go wrong for our people? Are we going to inadvertently de-skill our people if we use AI for that?”
Luckily, lawyers are well equipped to tackle questions like this, with “very strong views” on what’s ethical and what’s not.
Acknowledging the potential for harm is important, because CPS has no choice but to explore AI and other technologies that can make the organisation more efficient.
“Day in, day out, the nature of crime is changing, it’s becoming more challenging to deal with, so the only way we can respond is to continuously improve.”