AI has been flagged as a strategic risk to the UK in a new report published by the government today. It is the first time the technology has been featured in the National Risk Register, which documents a variety of threats facing the nation, but ministers have been accused of not knowing how to respond to the dangers posed by automated systems. Cyberattacks against the UK’s critical national infrastructure have also been pinpointed as a potential problem.
The National Risk Register has been released every two years since 2008. The latest edition was launched by Deputy Prime Minister Oliver Dowden during a visit to the Dogger Bank offshore wind farm in the North Sea.
UK’s National Risk Register unveiled
The report describes the risks of AI as having implications “spanning chronic and acute”, such as an increase in harmful misinformation or a reduction in economic competitiveness. The government says this is driving its agenda to boost AI safety, including its plan to hold a global summit on the topic.
Elsewhere, the report sets out several scenarios outlining how a cyberattack on critical national infrastructure, deemed a moderate risk, might unfold. It warns that an attack on electricity infrastructure could cause a total failure of the national electricity transmission system, a cyberattack on civil nuclear power could trigger the shutdown of nuclear sites, and an online attack on gas infrastructure could lead to “casualties and fatalities as a result of a lack of heating”, as well as disrupting the supply of energy for up to three months.
Other cyber risks are also included, such as those to health and social care, the transport sector and telecoms systems, which could affect “millions of customers”, the NRR says.
Malicious drone attacks are also cited as a risk. The NRR says such incidents at airports could have disastrous consequences, adding that, should an attack take place, “specialised police counter-drones” could be deployed in response.
A possible lack of detail in the report
However, the lack of detail in the report’s evaluation of AI risks could be a concern. Labour MP Darren Jones tweeted that AI is “barely mentioned” in the report and that ministers “don’t know what to do” about it. He called on Prime Minister Rishi Sunak to set up an AI sub-committee of the national security council to track the risk posed by the technology.
So what should the Government do next on AI risk?

The PM should set up an AI sub-committee of the national security council.

The G7 Hiroshima process should work with tech companies to accelerate AI safety tech investment.

👇https://t.co/IdBTZV1L0o

— Darren Jones MP (@darrenpjones) August 3, 2023
Dowden argues that the latest NRR is the most comprehensive risk assessment ever published. He says “the government and our partners can put robust plans in place and be ready for anything”.
This is critical to national safety, added Matt Collins, the deputy national security adviser. “This edition of the NRR, based on the government’s internal, classified risk assessment, offers even more detail on the potential scenarios, response and recovery options relating to the risks facing the UK, ranging from terrorism to conflicts and natural disasters,” he said.