How quantum computing will revolutionise AI and machine learning
Quantum computing promises to change the face of high-performance computing (HPC) by accelerating future data processing capabilities and fundamentally changing how computers work.
Moore's Law predicted that the number of transistors on integrated circuits would double every two years, but those transistors are now about as small as existing technology allows. This means we're approaching the limits of what classical computers can do.
At the same time, the demand for processing bigger and more complex data is increasing across healthcare, defence, energy and finance. These sectors need greater accuracy, precision and scale of calculations than what is currently available, even with state-of-the-art optimisation and simulation methods on traditional computers.
While quantum computing has emerged as the solution, the route and timing to delivering useful systems are not entirely clear. The use cases for large-scale quantum computers, which use millions of qubits and employ error correction, are well understood, but these devices will probably not be available for another few years. So, if you're interested in quantum computers sooner, and in how they could work with AI and machine learning (ML), you need to explore non-error-corrected systems, so-called noisy intermediate-scale quantum (NISQ) devices, which are currently the subject of heated debate about their usefulness.
How quantum works
If you read any standard description of quantum computers, you'll learn that they use ‘qubits,' the basic unit of quantum information, allowing them to process ones and zeros simultaneously in a state called ‘superposition.' Because n qubits can occupy a superposition of 2^n states at once, the number of states a large-scale machine with a million qubits can adopt is astronomically large. Certain algorithms exploit this to vastly reduce the time it takes to perform some of the complex maths required to analyse datasets.
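For intuition, a single qubit's state can be sketched in a few lines of ordinary Python. This is a toy illustration, not how real quantum hardware is programmed: a qubit is represented by two amplitudes, and the standard ‘Hadamard' operation puts it into an equal superposition of zero and one.

```python
import math

# A qubit's state is a pair of amplitudes (a, b) for the outcomes 0 and 1.
ZERO = (1.0, 0.0)  # the qubit definitely reads 0

def hadamard(state):
    """The standard Hadamard gate: maps a definite state into a superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities are the squared magnitudes of the amplitudes."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

plus = hadamard(ZERO)
print(probabilities(plus))  # close to (0.5, 0.5): an equal superposition
```

Applying the Hadamard gate a second time returns the qubit to its original definite state, which is one way superposition differs from a classical coin flip.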
However, to make quantum systems useful in the short term, companies are becoming increasingly innovative in their approach. The community is starting to move beyond mere ‘qubits' to ‘qutrits' and ‘qudits', which are multi-outcome quantum states (zero, one or two for a qutrit; zero, one, two and beyond for a qudit). Some groups are even exploring analogue quantum computers.
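The appeal of qudits comes from counting: n quantum systems with d outcomes each span d^n basis states, compared with 2^n for qubits, so each extra outcome per unit multiplies the available state space. A minimal sketch of this arithmetic:

```python
# Toy illustration: number of basis states spanned by n qudits,
# each with d possible outcomes (d = 2 for qubits, 3 for qutrits, ...).
def num_states(d: int, n: int) -> int:
    return d ** n

for d, name in ((2, "qubits"), (3, "qutrits"), (5, "five-level qudits")):
    # e.g. 10 qubits -> 1024 states, 10 qutrits -> 59049 states
    print(f"10 {name}: {num_states(d, 10)} basis states")
```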
There are also an increasing number of physical platforms that promise to process quantum information faster. While many are already familiar with ‘superconducting' qubits within circuits maintained at temperatures colder than deep space and shielded from magnetic fields, room-temperature alternatives are emerging, such as those that use photons, single particles of light.
Accelerating AI and machine learning
One way to make quantum available sooner is to find promising applications that don't require millions of error-corrected qubits to be useful. Current quantum computers contain up to around 100 qubits or qudits. This is enough to reach quantum advantage, where they can be used to solve specific contrived maths problems faster than any classical computers. But the race is on to map these small-scale systems onto useful applications such as AI and ML.
AI and ML are particularly promising fields since they rely on being able to identify and make use of complex correlations in whatever data is available. These fields currently depend on ever more compute-intensive classical resources to build models that can identify these correlations. As small-scale quantum computers intrinsically process data in a highly correlated way, they may be useful for some AI and ML applications. Just last year, a research group in the US was the first to use a small-scale quantum computer to solve a classification task faster than any classical computer. Even though this classification task was somewhat contrived and designed to work better on their device, there is a lot of ongoing research to find more practical use cases.
Two areas of AI and ML that seem particularly promising for quantum are generative modelling, for example finding potential protein sequences for biotechnology or improving natural language processing, and optimisation tasks such as improving the operational efficiency of supply chain logistics. Solving each of these problems requires accounting for particularly complex correlations (for example between words in sentences, or DNA bases in a genome), so vast amounts of traditional computation power are currently required. Near-term quantum computers, therefore, have a good potential to be able to solve at least parts of these tasks using less time and compute.
Next steps for quantum
It's likely that quantum computers will soon be able to solve larger and more complex computational problems that are beyond classical computers. Even though large-scale quantum computers are still some way off, the present availability of small quantum devices that can already solve specific problems faster than traditional computers means we are close to creating useful applications. This also means we're taking a step toward revolutionising AI and ML capabilities, and will soon be able to unlock new fields of application, accelerate innovation and transform industries with quantum.
William Clements is head of machine learning at ORCA Computing