Förderjahr 2025 / Stipendium Call #20 / ProjektID: 7728 / Projekt: Learning in the Quantum Regime
Quantum computers work in a fundamentally different way than classical ones, which makes it an interesting research question to study how we can use quantum information to reason about and learn from data.
Loosely put, Machine Learning (ML) refers to the process of learning from data. This could, for example, mean classifying data points (e.g., distinguishing cats and dogs in images) or learning some real-valued target function (e.g., predicting housing prices). A lot of time has passed, however, since such examples were considered state-of-the-art.
Modern ML models are largely based on neural networks, and most of them build on so-called Transformer models. These started out as language models and have more recently been adapted to other kinds of data, such as images and videos, as well. One common denominator of all these models is that they are incredibly big, and their training and inference require extensive computational resources and energy.
This is a primary reason why alternative computing methods for ML are being explored. These include specialized model architectures, such as the ones developed by Liquid AI [1], and even models based on entirely different computing paradigms, such as neuromorphic or quantum computing. In this blog post, I want to give a short introduction to how we can use quantum computers for ML.
Where Could Quantum Computers Provide Advantages?
To date, there are no clear applications in ML where quantum computers provide an advantage over classical ones. It is, however, possible to delineate some areas that could be promising.
Speed-Ups
The proposal of algorithms with significant quantum speed-ups is a key reason why quantum computing has become so popular. The most prominent examples are Shor's algorithm, which gives an exponential speed-up over the best known classical algorithm for integer factoring, and Grover's algorithm, which gives a quadratic advantage for searching unordered lists.
Superposition and entanglement allow for what is most commonly referred to as constructive and destructive interference, which is ultimately what makes these speed-ups possible. From another perspective, quantum computers provide a natural framework for manipulating information in high-dimensional vector spaces and Fourier-like representations, which many quantum algorithms exploit.
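To make the interference idea concrete, here is a minimal statevector sketch of Grover's search over 16 items, written in plain NumPy (no quantum hardware or backend involved). The oracle flips the sign of the marked item's amplitude, and the diffusion step reflects all amplitudes about their mean: the marked amplitude interferes constructively and grows, while the others interfere destructively and shrink. After roughly ⌊(π/4)√N⌋ = 3 iterations, measuring would return the marked item with high probability, instead of the ~N/2 checks a classical scan needs on average.

```python
import numpy as np

# Toy statevector simulation of Grover's search over N = 16 items.
N = 16
marked = 11  # index of the item we are searching for

# Start in the uniform superposition over all N basis states.
state = np.full(N, 1 / np.sqrt(N))

iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))  # = 3 for N = 16
for _ in range(iterations):
    # Oracle: flip the sign (phase) of the marked item's amplitude.
    state[marked] *= -1
    # Diffusion: reflect all amplitudes about their mean -- this is
    # where constructive/destructive interference happens.
    state = 2 * state.mean() - state

probabilities = state ** 2
print(int(np.argmax(probabilities)))        # 11
print(round(probabilities[marked], 3))      # 0.961
```

So after only 3 queries to the oracle, the marked item out of 16 is found with probability ≈ 96%, which is the quadratic advantage in miniature.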
So far, most quantum algorithms with exponential advantage have been reported for problems with an inherent algebraic structure. In particular, this is the case for the factoring algorithm, as well as its more general form, the Hidden Subgroup Problem. Such structure does not, however, come naturally for most ML problems we know from the classical world. Extensions of the Hidden Subgroup Problem to learning problems have been explored, but the learning variants break down if one is not careful about the incomplete problem description [2].
Another aspect that is often highlighted: due to the substantial overheads associated with quantum computation, practical quantum advantages will likely require more than modest polynomial speed-ups (super-quadratic [3]) in many applications. It is also important not to overlook the cost of data loading, which is why it is often argued that one should look at small-data problems [3].
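The overhead argument can be made with back-of-the-envelope arithmetic in the spirit of [3]. The sketch below compares brute-force search (n classical steps) against a Grover-style √n quantum steps, where each quantum step is assumed to be slower by a constant factor. The slowdown factor of 10^10 is an illustrative assumption, not a measured value; it stands in for slower gate times plus error-correction overhead. The point is structural: with a quadratic speed-up, the quantum approach only wins once the problem size exceeds the square of the slowdown.

```python
# Back-of-the-envelope comparison of classical vs quantum runtime for a
# quadratic (Grover-style) speed-up. The slowdown factor is an assumed,
# illustrative constant lumping together gate speed and error correction.

slowdown = 1e10  # assumed cost of one quantum step vs one classical step

def classical_steps(n):
    return n                    # brute-force search: n steps

def quantum_steps(n):
    return slowdown * n ** 0.5  # Grover: ~sqrt(n) steps, each much slower

# With a quadratic speed-up, quantum only wins once n exceeds slowdown**2.
break_even = slowdown ** 2
print(f"break-even problem size: {break_even:.0e}")  # 1e+20

for n in (1e12, 1e20, 1e24):
    print(n, "classical wins" if classical_steps(n) < quantum_steps(n)
             else "quantum wins or ties")
```

With these (assumed) numbers, the classical machine wins for any problem smaller than about 10^20 steps; an exponential speed-up, by contrast, would overcome the same overhead at far more modest problem sizes, which is why super-quadratic advantages are considered the practically relevant regime.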
Solving Problems That Are Hard for Classical Computers
A different route to finding quantum advantage in ML would be to identify tasks that are difficult for classical ML but come naturally to quantum computers. While it is generally believed (though unproven) that quantum computers will not solve NP-hard problems efficiently, there are a lot of problems that do not yet have an efficient classical algorithm associated with them, integer factoring being one example.
Things do, however, get tedious when we look at such problems in ML. Modern ML methods are very complex systems, and their theory is not yet well understood, which makes analyzing their limitations beyond experimental benchmarking very difficult. Researchers have therefore looked at problems related to Shor's factoring algorithm to show robust results [4]; however, it remains difficult to find practically relevant problems for such algorithms afterward.
[1] https://www.liquid.ai/. Accessed 13.05.2026
[2] Wakeham, David, and Maria Schuld. "Inference, interference and invariance: How the Quantum Fourier Transform can help to learn from data." arXiv preprint arXiv:2409.00172 (2024).
[3] Hoefler, Torsten, Thomas Häner, and Matthias Troyer. "Disentangling hype from practicality: On realistically achieving quantum advantage." Communications of the ACM 66.5 (2023): 82-87.
[4] Liu, Yunchao, Srinivasan Arunachalam, and Kristan Temme. "A rigorous and robust quantum speed-up in supervised machine learning." Nature Physics 17.9 (2021): 1013-1017.
Blog Picture by AcatXlo: https://pixabay.com/illustrations/circuit-hexagonal-geometric-pattern-7…