Energy companies have had a lot to worry about this past year. The economic disruption caused by Russia’s ongoing war in Ukraine has amplified calls for an accelerated energy transition in Europe — a shift that could offer the combined benefits of alleviating the continent’s dependence on Russia and cutting down on its use of highly polluting fossil fuels.
In order to keep global warming under 1.5°C — the stated goal of the 2015 Paris Agreement — carbon emissions will need to reach net zero by 2050, according to the UN. Added to this challenge? Global power consumption is expected to triple by 2050 as living standards grow worldwide, according to a 2022 report by McKinsey.
In their efforts to achieve these complex — and apparently conflicting — objectives, energy companies such as E.ON SE, Eni, and EDF are turning to quantum computing. They’re hoping to harness the technology’s immense potential processing power to deliver more energy more efficiently, thus maintaining a stable and sustainable supply of electricity for the years to come.
Quantum optimisation is well-suited to the energy sector
One way to help get to net zero is by decentralising energy supply networks, a trend that’s happening across Europe. Every day, companies are adding more small-scale generation units, such as wind turbines and household solar panels, to their power grids. They’re also connecting a host of EV charging points, which feed batteries that can store power until it’s needed.
These decentralised grids tend to be more efficient than traditional networks since the sources are often closer to the users, meaning less energy is lost during transmission. They’re also better at linking small-scale renewable sources, rather than depending exclusively on large-scale plants, and offer greater control over the entire network, helping companies cope with fluctuating demand.
The perks are numerous. Managing decentralised grids, however, is incredibly complex since companies have to make real-time decisions that factor in a huge range of variables. This leaves the energy industry facing “very difficult mathematical problems,” explains Heike Riel, a researcher at IBM. Such problems might, indeed, turn out to be too tough for classical supercomputers to handle — meaning energy suppliers are being forced to rely on approximations that, given their margins of error, don’t offer maximum efficiency.
But it’s not just supply that needs to be optimised on decentralised grids. The tools we use to produce and transmit our energy all need to be maintained throughout their lifecycles to cope with regular wear and tear, as well as any damage caused by freak weather or felled trees. This, too, requires complicated scheduling optimisation — both for the maintenance work itself and for the teams of workers who perform it. “You’re balancing having lots of extra spare parts, which is basically reducing your efficiency, versus running out, which is going to bring problems in terms of operating capacity,” explains Murray Thom, vice-president at quantum computing company D-Wave Systems.
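The spare-parts balance Thom describes can be framed as a small optimisation problem in its own right. The sketch below is a toy, newsvendor-style model with entirely invented costs and failure rates, not anything the companies quoted here actually run: it weighs the cost of holding each spare against the expected cost of running short when parts fail.

```python
import math

# Invented illustrative figures: each spare held costs HOLD per period;
# each part you are short when failures hit costs OUTAGE.
HOLD = 200.0       # cost of keeping one spare part in stock per period
OUTAGE = 5000.0    # cost per missing part when demand exceeds stock
MEAN_FAILURES = 3  # expected part failures per period (Poisson-distributed)

def poisson_pmf(k, lam):
    """Probability of exactly k failures in a period."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def expected_cost(stock):
    """Expected holding cost plus expected outage cost for a stock level."""
    cost = HOLD * stock
    for k in range(stock + 1, 50):  # truncate the Poisson tail
        cost += OUTAGE * (k - stock) * poisson_pmf(k, MEAN_FAILURES)
    return cost

# Pick the stock level that minimises expected total cost.
best = min(range(0, 15), key=expected_cost)
```

With these made-up numbers the model settles on holding a handful of spares: enough to make a stockout unlikely, but not so many that holding costs dominate. Real maintenance scheduling couples hundreds of such decisions with crew rosters and asset lifecycles, which is where the combinatorics explode.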
There’s also the question of setting up new generators. When planning fresh developments, companies need to consider everything from local weather conditions to demand, grid constraints, supply chain challenges, transportation costs, and employee availability. Renewables, like wind turbines, are particularly vulnerable to the vagaries of their environment, so such calculations will only grow more pressing — and more complicated — as companies shift to sustainable resources.
Indeed, researchers from Microsoft announced, all the way back in 2018, that they had developed a new quantum-inspired algorithm for so-called ‘unit commitment’ – identifying the best power-producing resources to activate based on forecasted demand, efficiencies, and capacity limitations. This tool already outpaced more powerful classical systems in a demonstration – and will likely show a far bigger advantage when scaled-up quantum computers become commercially available.
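Microsoft’s algorithm isn’t reproduced here, but the shape of the unit-commitment problem it tackles is easy to illustrate. The sketch below, with invented generators and costs, brute-forces every on/off combination to find the cheapest mix that covers forecast demand; its exhaustive search over 2^n subsets is precisely why real grids with hundreds of units overwhelm classical solvers.

```python
from itertools import chain, combinations

# Toy generator fleet: (name, capacity in MW, cost per MWh, start-up cost).
# All figures are invented for illustration.
GENERATORS = [
    ("coal",  400, 30, 500),
    ("gas_1", 150, 45, 120),
    ("gas_2", 150, 50, 100),
    ("wind",  120,  5,   0),
]

def commit(demand_mw):
    """Brute-force unit commitment: the cheapest subset of generators
    whose combined capacity covers demand, or None if none can."""
    best = None
    units = range(len(GENERATORS))
    all_subsets = chain.from_iterable(
        combinations(units, r) for r in range(1, len(GENERATORS) + 1))
    for subset in all_subsets:
        if sum(GENERATORS[i][1] for i in subset) < demand_mw:
            continue  # this combination cannot meet demand
        # Dispatch cheapest-first (merit order) up to demand.
        remaining, cost = demand_mw, 0.0
        for i in sorted(subset, key=lambda i: GENERATORS[i][2]):
            _, cap, mwh_cost, start = GENERATORS[i]
            produced = min(cap, remaining)
            cost += start + produced * mwh_cost
            remaining -= produced
        if best is None or cost < best[1]:
            best = ([GENERATORS[i][0] for i in subset], cost)
    return best
```

For four units the loop checks just 15 combinations; for a national fleet of a few hundred units the same search would need more evaluations than there are atoms in the universe, which is the combinatorial wall that quantum and quantum-inspired optimisers aim to climb.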
These aren’t new dilemmas. Such problem-solving has previously been handled – albeit more slowly and less precisely – by large, expensive data centres that burn through lots of energy. But quantum could, researchers believe, be faster, more accurate, and more cost-effective.
It could also be a research accelerator, explains Riel. Scientists have, for example, long struggled to improve batteries’ capacity to store energy. This is tough to investigate experimentally because so many complex chemical reactions are involved. But if we can precisely predict those reactions using the processing power of a quantum computer, we might be able to boost batteries’ capacities and save precious energy.
Energy companies and quantum
Multiple energy companies have begun investing in quantum research – lured by the pressures of sustainable development and the potential to win big economic gains.
German energy giant E.ON is trying to use IBM’s quantum capabilities to optimise the output of its decentralised power infrastructure. The partnership, announced in 2021, gives E.ON access to IBM’s quantum computing systems, via the IBM Cloud, as well as IBM’s expertise and Qiskit quantum software developer tools. “For E.ON, the innovative use of quantum computing offers an opportunity to solve complex and cross-system optimisation tasks in the energy transition in an innovative way,” said Victoria Ossadnik, who was E.ON’s chief digital officer at the time.
French energy supplier EDF, meanwhile, partnered with startup Quandela to study the use of photonic quantum computing in simulations of deformations in hydroelectric dams. This research aims to boost the speed and accuracy of these simulations and thus enable better design and maintenance of these flexible energy sources.
In November 2022, Italian energy company Eni teamed up with Paris-based quantum computing startup Pasqal in its mission to use quantum technologies “to solve some of the most advanced computing problems, currently not approachable even by supercomputers,” a spokesperson told Tech Monitor. They’re investigating both quantum optimisation and quantum machine learning — aiming to use these tools for everything from simulating reservoirs to studying magnetic fusion.
This kind of development, as evidenced by these partnerships, thrives on collaboration. “The value of industry partners and larger businesses driving the quantum industry forward together should not be underestimated,” says Daniel Goldsmith, senior quantum technologist at UK innovation agency Digital Catapult. “For startups in the quantum space to succeed and effectively launch their solutions, industry partners must be involved in a collaborative process that enables the development of new technologies to solve broader industrial challenges.”
So what’s stopping everyone from getting onboard? It might still be too soon for many companies to see — and believe — the huge rewards that quantum researchers are promising. “Right now, we are in what analysts call the Noisy Intermediate Scale Quantum (NISQ) era of quantum computing,” says Goldsmith. “This is where devices are small, qubits are prone to error, and proof of concepts don’t yet have the success to achieve wide-scale business adoption.”
Some people also just don’t know what quantum’s all about, explains Thom. “There isn’t enough awareness because the technology is emerging really, really quickly,” he says. “A lot of folks that I’m talking to are kind of like, ‘what is a quantum computer?’”. Energy companies in particular might be more conservative in their technological experimentation, says Thom, because of the mission-critical nature of their work.
Capacity might not be ready yet, says Riel, but it’s not too soon to start looking into problems that could be solved by quantum computers. Otherwise, she warns, companies risk being left in the dust by their tech-savvy competitors. Quantum, with all its radical promises, isn’t going away any time soon. “We are on a steep trajectory,” says Riel. “It’s time to get your feet wet.”