IBM claims its new 127-qubit quantum computer can produce accurate results at a scale beyond what is possible with classical computing. Pitted against an advanced classical supercomputer on a problem involving simulating the properties of matter, it continued to produce accurate results as the problem became more complex, even as the supercomputer began to falter. Big Blue says this marks the start of the era of quantum utility.

IBM says the quantum results with error mitigation would continue to produce answers long after classical approximation methods faltered (Image: IBM Quantum)

Current-generation quantum hardware is noisy and error-prone. The way it operates inevitably produces faults that have to be corrected or mitigated before it can perform useful calculations. The challenge faced by quantum engineers is finding ways to do this faster than new errors accumulate.
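For readers curious how mitigation can work in practice, one widely used technique is zero-noise extrapolation: run the same circuit at several deliberately amplified noise levels, then extrapolate the measured expectation value back to the zero-noise limit. Below is a minimal, hardware-free Python sketch of the idea; the decay model, noise gains and values are invented for illustration and are not taken from IBM's experiment.

```python
import numpy as np

# Zero-noise extrapolation (ZNE) on synthetic data. Assumption: the
# noisy expectation value decays roughly exponentially with the noise
# level g, i.e. <O>(g) ~ <O>_ideal * exp(-b * g). All numbers here are
# invented for illustration; nothing is measured from real hardware.

ideal_value = 0.75                     # hypothetical noiseless value
decay_rate = 0.4                       # hypothetical decay constant b

noise_gains = np.array([1.0, 1.5, 2.0, 3.0])   # amplified noise levels
rng = np.random.default_rng(0)
measured = ideal_value * np.exp(-decay_rate * noise_gains)
measured += rng.normal(0.0, 0.002, measured.shape)  # simulated shot noise

# Fit log(<O>) = log(<O>_ideal) - b * g, then extrapolate to g = 0.
slope, intercept = np.polyfit(noise_gains, np.log(measured), 1)
zne_estimate = np.exp(intercept)

print(f"raw value at native noise (g=1): {measured[0]:.4f}")
print(f"ZNE estimate at g=0:             {zne_estimate:.4f}")
print(f"true noiseless value:            {ideal_value:.4f}")
```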

In a new experiment, IBM researchers performed fast error mitigation on the 127-qubit Eagle quantum processor to accurately predict the properties of a material. The results were verified by the University of California, Berkeley, using an advanced supercomputer.

The findings were published in the scientific journal Nature, with the verification calculations carried out on supercomputers at Lawrence Berkeley National Laboratory and Purdue University. IBM’s Eagle returned accurate answers every time, regardless of how complex the computations became, and the team is confident the results remained accurate even where they could no longer be checked on classical hardware. IBM Quantum scientist Andrew Eddins said: “The level of agreement between the quantum and classical computations on such large problems was pretty surprising to me personally.”

IBM warned that this is not a claim that today’s quantum computers exceed all abilities of classical computers; other classical methods and specialised machines could return correct answers for the same calculations. Rather, it is proof of quantum utility, providing users with “confidence in the abilities of near-term quantum computers.”

Being able to model and simulate materials in ways that classical computers cannot manage efficiently is one of the ultimate goals of quantum computing. It has long been held up as the point of quantum advantage, although the understanding of what “advantage” means has shifted over time due to improvements in error mitigation.

The company says achieving faster material property modeling will lead to more efficient fertilisers, better batteries and new types of medicines. This is because of the sheer number of parameters and data points involved in understanding the nature of a material: too many for a classical machine to iterate through in any meaningful time period, but potentially tractable for quantum hardware.
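A rough back-of-the-envelope calculation illustrates that scaling (illustrative arithmetic only, not a figure from IBM’s paper): exactly storing the quantum state of n qubits on a classical machine requires 2^n complex amplitudes.

```python
# Back-of-the-envelope cost of exact classical simulation: a full
# statevector of n qubits holds 2**n complex amplitudes, at 16 bytes
# each (complex128). Illustrative figures only.
for n in (30, 50, 127):
    amplitudes = 2 ** n
    terabytes = amplitudes * 16 / 1e12
    print(f"{n:>3} qubits: {amplitudes:.2e} amplitudes = {terabytes:.2e} TB")
```

At 127 qubits this works out to roughly 10^27 terabytes, which is why exact simulation gives way to approximation methods long before that point.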

Quantum computers becoming ‘capable scientific tools’ – IBM

It has long been assumed that thousands of qubits would be needed to achieve any meaningful advantage, but IBM says this new work makes it possible to outperform classical simulations with 127 qubits. This is because the errors in the quantum processor can be learned and then mitigated.
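As a sketch of that “learn, then mitigate” idea, the Python snippet below fits the signal decay of a known-outcome circuit at increasing depth, then uses the fitted per-layer fidelity to rescale a raw result. This is a crude global-depolarising toy model, not IBM’s actual procedure, and every number in it is synthetic.

```python
import numpy as np

# Hypothetical "learn, then mitigate" loop: run a circuit whose ideal
# outcome is known at several depths, fit the per-layer decay of the
# signal, and rescale a raw production result by the fitted fidelity.
# All numbers are invented for illustration.

rng = np.random.default_rng(1)
true_layer_fidelity = 0.98             # hypothetical per-layer fidelity

depths = np.array([1, 2, 4, 8, 16])
signal = true_layer_fidelity ** depths + rng.normal(0, 0.003, depths.size)

# Fit log(signal) = depth * log(f) + c to estimate f.
log_f = np.polyfit(depths, np.log(signal), 1)[0]
layer_fidelity = np.exp(log_f)

raw_value = 0.61                       # raw result from a depth-10 circuit
mitigated = raw_value / layer_fidelity ** 10
print(f"estimated per-layer fidelity: {layer_fidelity:.4f}")
print(f"mitigated expectation value:  {mitigated:.4f}")
```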

“This is the first time we have seen quantum computers accurately model a physical system in nature beyond leading classical approaches,” said Darío Gil, senior vice president and director of IBM Research. “To us, this milestone is a significant step in proving that today’s quantum computers are capable, scientific tools that can be used to model problems that are extremely difficult – and perhaps impossible – for classical systems, signalling that we are now entering a new era of utility for quantum computing.”

IBM has been taking a forked approach to quantum processor development, working on different streams with a range of error mitigation techniques. The new processors will now be deployed to the IBM Quantum Cloud and partner locations, including the recently announced European quantum data centre in Germany.

“As we progress our mission to bring useful quantum computing to the world, we have solid evidence of the cornerstones needed to explore an entirely new class of computational problems,” said Jay Gambetta, IBM fellow and vice president at IBM Quantum. “By equipping our IBM Quantum systems with processors capable of utility scale, we are inviting our clients, partners and collaborators to bring their hardest problems to explore the limits of today’s quantum systems and to begin extracting real value.”

The company says its new generation of QPUs opens the door to “utility scale” standards of quantum computing that can surpass classical methods for certain applications. It added that this will only improve as error mitigation techniques continue to advance.

The initial focus will be on reaching quantum value across a range of sectors, including healthcare and life sciences. This work will be led by groups including the Cleveland Clinic and Moderna, with research into quantum chemistry and quantum machine learning to speed up molecular discovery and patient risk predictions.

IBM says CERN and DESY will use the new processors to work on determining the best quantum calculations for fusion modeling, while Boeing, Bosch and ExxonMobil will be among the groups exploring workflows for material simulations.

