Google claims quantum advantage for machine learning
Google’s AI researchers claim they’ve achieved quantum advantage – the point at which a quantum computer can perform tasks that are not possible with a classical machine – in the field of machine learning.
The milestone was reached on Sycamore, Google’s quantum computer, which completed a series of learning tasks using a quantum learning algorithm that analyzes the output of quantum sensors, the researchers said.
“Unlike previous quantum advantage demonstrations, no advances in classical computing power could overcome this gap,” the researchers who carried out the experiment wrote. “This is the first demonstration of a provable exponential advantage in learning about quantum systems that is robust even on today’s noisy hardware.”
A classical machine learning system cannot directly access and learn from quantum information, the researchers say, whereas the quantum learning agent can interact directly with the quantum data, providing a level of analysis not possible in any other way.
How Google’s quantum computer outperforms classical machines
In a new paper published in Science, the team demonstrated that their quantum learning agent also performed exponentially better than classical machine learning on tasks not involving quantum data.
“Quantum computers will likely offer exponential improvements over classical systems for certain problems, but to realize their potential, researchers first need to scale up the number of qubits and to improve quantum error correction,” the authors explained.
But even current-generation noisy quantum computers are a vast improvement on classical machines when it comes to analyzing data from quantum sensors, they add. These sensors are already widely used for high-precision measurements, and exploit correlations between particles to extract more information about a system than would otherwise be available. They can be deployed for environmental mapping and to measure things such as magnetic fields.
The practical implementation of quantum machine learning
Even without harnessing the potential of quantum sensors, the Google team found their quantum computer was “exponentially better” at analyzing traditional data than a classical machine.
“This experimental work represents the first demonstrated exponential advantage in quantum machine learning,” the authors wrote. “This type of quantum learning advantage cannot be challenged, even by unlimited classical computing resources.”
While Google’s results were achieved in lab conditions, businesses are already studying ways quantum machine learning techniques could be deployed in the real world. Speaking at the AI Summit in London earlier this month, Dimitrios Emmanoulopoulos, lead data scientist at Barclays, said there were already several realistic use cases for quantum machine learning in financial services, including fraud detection.
“Quantum computers are expected to offer, for certain tasks, exponentially faster computational times than classical processors,” Emmanoulopoulos said. “Currently, training state-of-the-art machine learning models is computationally very expensive, and even with the latest hardware the training times can reach up to several weeks.”
Emmanoulopoulos said that quantum neural networks running on current quantum computing technology can already be trained significantly faster than a classical machine learning model, and that training times are likely to drop further as more advanced quantum machines hit the market.