From getting shut down in a live interview for failing to answer a repeated question, to admitting he shared a tryst with a married colleague in his previous job as a fireplace salesman, British defence secretary Gavin Williamson has attracted ample publicity in recent months, not all of it favourable.

But when Apple’s virtual assistant Siri interrupted him today as he addressed Parliament during a counter-Daesh update, security professionals were astounded.

Interrupting the Defence Secretary mid-flow, his phone, picking up on a verbal cue, piped up: “Hi Gavin, I found something on the web for: ‘In Syria, democratic forces supported by coalition…’”

“What a very rum business that is,” the Defence Secretary said, amid perplexed laughter. “It’s very rare that you are heckled by your own mobile phone.”

“Inexcusable”

Rodolfo Rosini, the information security entrepreneur and now a partner at Zeroth.ai, the Hong Kong-based accelerator focused on AI and machine learning, was not amused.

He told Computer Business Review: “In itself, leaving Siri on is not a huge risk. What it shows [though] is that the guy has no OPSEC [operational security], so if he sets Siri to always listening, he may have **** security on his home computers, download dodgy apps, etc. It’s inexcusable for someone in his position. Basically the problem is that he signalled he is an easy target with no clue.”

He also suggested that the Secretary of State may not have gone to GCHQ to secure all his devices, “because they would have disabled the ‘always on’ mic.”

The Defence Secretary’s office has been contacted for comment on his security arrangements.

His responsibilities include nuclear operations, strategic operations and operational strategy (including membership of the National Security Council), along with defence planning, programme and resource allocation, and strategic international partnerships with both nation states and organisations like NATO.

The incident comes after Chinese researchers discovered a way to hijack smart assistants like Apple’s Siri using sounds inaudible to the human ear, raising security concerns about the voice-activated devices. The hack was created by Guoming Zhang, Chen Yan and colleagues at Zhejiang University in China.

Using ultrasound, an inaudible command can be used to wake the assistant, giving the attacker control of it as well as access to any connected systems.
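The core of the technique is simple signal processing: the voice command is amplitude-modulated onto an ultrasonic carrier, so all of its energy sits above the ~20 kHz limit of human hearing, while non-linearities in the target device’s microphone demodulate it back into the audible band. The sketch below illustrates just the modulation step; the sample rate, carrier frequency and stand-in “command” tone are illustrative assumptions, not values from the Zhejiang University research.

```python
import numpy as np

FS = 192_000         # sample rate high enough to represent a 30 kHz carrier
CARRIER_HZ = 30_000  # ultrasonic carrier, above human hearing (~20 kHz)

def modulate_ultrasonic(baseband: np.ndarray, fs: int = FS,
                        carrier_hz: float = CARRIER_HZ) -> np.ndarray:
    """AM-modulate a baseband signal onto an ultrasonic carrier."""
    t = np.arange(len(baseband)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Standard AM: offset the baseband so the envelope stays non-negative.
    return (1.0 + baseband) * carrier

# Stand-in "voice command": a 1 kHz tone lasting 10 ms.
t = np.arange(int(0.01 * FS)) / FS
command = 0.5 * np.sin(2 * np.pi * 1_000 * t)

signal = modulate_ultrasonic(command)

# All spectral energy now sits around the carrier (30 kHz +/- 1 kHz),
# i.e. entirely above the audible range.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / FS)
peak_hz = freqs[np.argmax(spectrum)]
print(f"dominant frequency: {peak_hz:.0f} Hz")
```

In the actual attack a speaker plays this modulated waveform; the microphone’s non-linear response recovers the original command, which the assistant then transcribes as if it had been spoken aloud.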

See also: Are you Siri-ous? Why The Security Risks of Virtual Assistants Are No Laughing Matter