The sky’s the limit: How the CAA is shaping smarter, safer aviation

In a fractured industry, standardisation reigns supreme

The Civil Aviation Authority encourages companies to play nicely with each other, but it’s not above copying homework.

Regulation is not an easy task, even for agencies that oversee a homogeneous sector. For those like the Civil Aviation Authority (CAA), which have to weld many different types of company into a cohesive industry, it’s even more of a challenge.

“Our remit is everything aerospace,” says CAA CIO Matt Taylor. And that means everything: from pilots’ licences, to aircraft approvals, to flight simulators, to making sure it’s safe to let off balloons at your summer fête (“So we can make sure they don't drift into military or civil airspace”).

The CAA runs the ATOL scheme, a financial insurance product; regulates space operations and aviation cybersecurity; and is even the registrar for births and deaths that happen in the sky (“Doesn't happen very often, but does happen sometimes”).

“There's loads of different things that we do - some of them in very small volume, but an awful lot of different things. And data's key to all of that, in very different ways.”

All those different parts mean lots of different datasets, and joining them up to work together has been one of Matt’s major aims since becoming CIO in 2018.

“I created a central data analysis team in my team so we could join together all of those people across the CAA who are basically data analysts or data engineers of some sort with different levels of proficiency, different sorts of skills and tools, and we tried to standardise that as much as possible. So, some consistent training, consistent cloud-based data platforms, a small set of data analysis and presentation tools.

“That's given lots of colleagues much more power and capability, but it's also meant that we can do a much better job of analysing across the organisation than we were able to previously.”

Built on sharing

Like the oil and gas firms in the North Sea, the UK’s aviation companies share plenty of data with each other, although in this case the impetus didn’t come from the regulator.

“There are generally no barriers to sharing information where that's seen to be something that has a safety relevance... so there's that kind of culture in there already.”

Aviation isn’t as simple as supplier and customer. There are airlines, airports, maintenance staff, baggage handlers, air traffic control and many more parts that need to work together to make sure flights leave safely and on time.

“There's a bunch of people who need to have common access to the same information so that that can be as smooth as possible,” Matt notes.

But, he admits, “that doesn't work nearly as smoothly as you might think.”

Terminology, it turns out, is a major blocker. Even something as straightforward as an aircraft’s arrival time might have many conflicting sources of information: from its departure airport, from air traffic control, from the schedule and from the aircraft itself. Likewise, departure time could mean when all the checks are completed, when the plane leaves the gate or when its wheels leave the tarmac.
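One way to make those competing definitions explicit is to tag every reported time with the milestone it actually measures. The sketch below is purely illustrative (the milestone names and `TimestampReport` structure are invented for this article, not anything the CAA or an airport actually runs), but it shows how tagging removes the ambiguity: you can then compare like with like.

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum

class DepartureMilestone(Enum):
    """Hypothetical labels for the different things 'departure time' can mean."""
    CHECKS_COMPLETE = "checks_complete"   # all pre-flight checks finished
    OFF_BLOCKS = "off_blocks"             # aircraft leaves the gate
    WHEELS_UP = "wheels_up"               # wheels leave the tarmac

@dataclass(frozen=True)
class TimestampReport:
    source: str                     # e.g. "air_traffic_control", "aircraft"
    milestone: DepartureMilestone   # which definition this source is using
    time: datetime

def latest_report(reports, milestone):
    """Pick the most recent report for one agreed milestone, ignoring
    sources that are actually reporting a different milestone."""
    matching = [r for r in reports if r.milestone is milestone]
    return max(matching, key=lambda r: r.time, default=None)
```

Once every source declares which milestone it is reporting, "conflicting" times often turn out not to conflict at all: they were measuring different events.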

Standardising a single approach is key to efficiency.

“If you could be really confident in what was the truth, and you knew when that aircraft was going to land, you could run your airport in a much more real-time way.”

Having better insights can reduce queues, maximise shopping time (and thus airport revenues) and save time for airport staff like baggage handlers. That’s where AI comes in.

AI in the past, present and future

Some airports are already experimenting with AI. Copenhagen has an AI-enabled platform where “all the different parties in the airport” can collaborate, while closer to home, Heathrow is experimenting with many different applications.

These AI systems can predict arrival and departure times more accurately and manage passenger flow around the airport. In the future, they could even be used in air traffic control - an application already being trialled at Heathrow.

[Image: AI is already assisting with passenger flow in airports – and in the future, will be able to predict and avoid queue buildups]

Of course, AI in aviation will need to be tightly controlled, but the sector has an advantage there: aircraft are already highly automated, “so it's not massively different when the type of automation changes,” according to Taylor.

The CAA is trying to pre-empt any issues with a framework of AI principles for the sector, which is “very similar to lots of other [frameworks] - so, worrying about fairness and bias, accountability and governance, transparency, contestability and redress...and of course, safety, security, resilience, those sorts of things.”

“We don’t want to be the people throwing jelly at the wall to see what sticks”

Copying other people’s homework is only a bad thing at school; here, it’s just good sense. Matt says the CAA is “keeping a good eye on what everyone else is doing.”

“Ideally, what we want to do is see where other people have made a genuine business-benefiting adoption of AI and then do that. We kind of want to not be the people that are throwing jelly at the wall and seeing what sticks, because lots of other people are doing that. We'd love to take advantage of the learnings of others and do that.”

Recently, the CAA has been refining its risk management. As AI use grows, the regulator is streamlining the rollout of new applications with a set of standard use cases against which it can assess risk.

The cases range from creating content and analysing data to modelling scenarios and decision making, with “fairly standard” risk analysis applied using the principles framework. For example, decision-making risks might include, “Is enough information being captured?”, “Do we know how the AI made its decision?” and, “Is there a clear audit trail?”
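A use-case-driven checklist like this lends itself to a very simple structure. The sketch below is hypothetical (the questions for decision making come from the article; the other questions, the category names and the helper function are invented for illustration and are not the CAA's actual tooling):

```python
# Hypothetical mapping from AI use-case categories to standard
# risk-assessment questions. The decision-making questions are the ones
# quoted in the article; the rest are illustrative assumptions.
RISK_QUESTIONS = {
    "content_creation": [
        "Is generated content reviewed before publication?",
        "Could the output contain inaccurate or biased material?",
    ],
    "data_analysis": [
        "Is the input data of sufficient quality?",
        "Are the results reproducible?",
    ],
    "decision_making": [
        "Is enough information being captured?",
        "Do we know how the AI made its decision?",
        "Is there a clear audit trail?",
    ],
}

def assessment_checklist(categories):
    """Collect the standard questions for every category a proposed AI
    application touches, deduplicated and in a stable order."""
    seen, checklist = set(), []
    for category in categories:
        for question in RISK_QUESTIONS.get(category, []):
            if question not in seen:
                seen.add(question)
                checklist.append(question)
    return checklist
```

The appeal of this shape is that assessing a new application becomes a lookup rather than a bespoke exercise, which is exactly the speed-up the article describes.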

This approach gives the CAA a quick assessment of AI risks and mitigations, speeding the rollout of new applications – which is important, as it can be difficult to justify the cost of AI.

“We try really hard to be cost-recovery-only, so we tend to run relatively lean. I can't go and spend £200,000 just to see what would work - and for Copilot for all of the CAA, we're talking half a million pounds a year.”

Smaller organisations tend to run up against this issue when it comes to modern AI tools, forcing a more complex approach than a simple company-wide licence; for example, they might adopt a per-query model.
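As a toy illustration of why the pricing model matters at this scale: the £500,000-a-year figure comes from Taylor's quote above, but the per-query price and usage volumes below are invented assumptions, purely to show the break-even logic.

```python
# Toy break-even comparison between a flat org-wide licence and
# per-query pricing. Only the £500,000 figure comes from the article;
# the per-query rate is an invented assumption.
FLAT_LICENCE_PER_YEAR = 500_000   # £ per year (from the article)
PRICE_PER_QUERY = 0.05            # £ per query (assumed for illustration)

def cheaper_option(queries_per_year):
    """Return which pricing model is cheaper at a given usage level."""
    per_query_cost = queries_per_year * PRICE_PER_QUERY
    return "per-query" if per_query_cost < FLAT_LICENCE_PER_YEAR else "flat licence"
```

At these (assumed) rates, per-query pricing only loses its appeal once usage climbs past ten million queries a year - which is why a lean, cost-recovery organisation might prefer it.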

Whichever approach you use, this will be the year in which you need to prove AI’s value.