March 17, 2017 (updated 20 March 2017, 4:47pm)

Are You Siri-ous? Why the Security Risks of Virtual Assistants Are No Laughing Matter

There are numerous security implications associated with virtual assistants, some of which are only just starting to come to light.

By James Nunns

Virtual assistants are everywhere nowadays, whether they're embedded in phones or used in the home, by both adults and children. As technology progresses, virtual assistants are being introduced to an increasing number of devices and vehicles, most recently with gadgets like Amazon Echo that are promoted as an assistant that sits in the background and listens out for voice commands.

Jaroslaw Czaja, CEO of Future Processing

However, there are numerous security implications associated with virtual assistants, some of which are only just starting to come to light. The security of virtual helpers was the focal point of a recent criminal case in America, where police sought to extract what an Amazon Echo had heard during a murder investigation. With the device effectively acting as a 'witness', the case showed that virtual assistant gadgets can retain the information they hear and replay speech or captured sound to a third party at a later date.

Whilst virtual assistants are now an addition to many homes and workplaces around the globe, users should be aware of how to deal with potential security issues and understand the consequences not just of their actions, but of their words. So, what are the security risks of using virtual assistants, or even of talking while one is nearby?

You’re talking – are they listening?

It is now commonplace to have conversations around IoT-enabled devices without thinking twice. Conversing with virtual assistant applications (like Siri or Cortana) to receive information has also become part of an everyday routine for some of us. Virtual assistants are usually activated via a voice command (typically the assistant’s name or a stock phrase), but what happens when they aren’t activated? Is it possible that they are still listening to everything that is said in the background, or have they shut off completely?
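As a rough illustration of how wake-word activation is meant to work, most designs keep a short rolling buffer of audio on the device and only send speech onwards once the activation phrase is detected. The sketch below is a simplified, hypothetical Python outline of that pattern; the function names (`transcribe_locally`, `stream_to_cloud`) and the buffer size are invented for this example and do not describe any manufacturer's actual implementation.

```python
# Hypothetical sketch of wake-word gating: audio frames are checked locally
# and discarded unless the wake phrase is heard. All names are placeholders.
from collections import deque
from typing import Deque, List

WAKE_WORD = "alexa"      # example activation phrase
BUFFER_FRAMES = 50       # keep roughly a second of recent audio on the device

recent_audio: Deque[str] = deque(maxlen=BUFFER_FRAMES)  # rolling buffer, constantly overwritten

def transcribe_locally(frame: str) -> str:
    """Stand-in for an on-device keyword-spotting model (frames are plain text here)."""
    return frame.lower()

def stream_to_cloud(frames: List[str]) -> None:
    """Stand-in for the network call that should only happen after activation."""
    print("Uploading after wake word:", frames)

def handle_frame(frame: str) -> None:
    recent_audio.append(frame)                  # older frames fall out of the buffer
    if WAKE_WORD in transcribe_locally(frame):  # wake phrase detected?
        stream_to_cloud(list(recent_audio))     # only now does audio leave the device

# Background chatter stays local; the wake phrase triggers an upload.
for frame in ["what's for dinner", "alexa, what's the weather"]:
    handle_frame(frame)
```

The security question, of course, is whether a given device really discards the audio it hears before activation, or quietly retains it.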

The possibility of virtual assistants listening while they are seemingly deactivated is not inconsequential. It means that they can gather information and record your speech without you being aware that it’s happening. This collated information could either be used to show personalised adverts, or – more worryingly – stored to be accessed and replayed to a third party at a later date. Devices that do the latter are undoubtedly a security threat to both individuals and companies, as the device will have a log of everything that is said around it.

Chinese whispers turns to cybercrime

A virtual assistant's ability to listen to speech and reply accordingly is undoubtedly very smart. However, a machine's capability to listen and respond is also limited, because it lacks a human's ability to apply context to the meaning of words. If someone says a harmless sentence, the virtual assistant may misunderstand what that sentence means. Because machines cannot process the context of a sentence, an innocent phrase or word can be misconstrued as a threat, a crime or evidence of plotting. Even users who make a joking request of their virtual assistant may unwittingly be creating a permanent log of 'evidence'.
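To see why context matters, consider a naive keyword filter of the kind a poorly designed system might apply to transcribed speech. The sketch below is purely illustrative: the watch-list and the example phrase are invented, and no real assistant is claimed to work this way. It flags an obviously harmless figure of speech simply because a trigger word appears.

```python
# Hypothetical example of context-free keyword matching misreading harmless speech.
ALERT_TERMS = {"kill", "bomb", "steal"}   # invented watch-list, for illustration only

def flag_transcript(transcript: str) -> list:
    """Return any 'alert' words found, with no understanding of their context."""
    words = transcript.lower().replace(",", "").split()
    return [w for w in words if w in ALERT_TERMS]

print(flag_transcript("I could kill for a coffee right now"))
# ['kill'] -- a figure of speech becomes a misleading entry in a permanent log
```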

Conversations in the clouds

Unless a recording is made deliberately, it is reasonable to expect that your words are not stored anywhere, least of all somewhere you cannot access. Many people may therefore find it troubling that when virtual assistants are listening, whether active or not, those words can be stored and automatically sent to the Cloud. Once the information is in the Cloud, you have no control over who has access to it or how it is used, especially if it is hacked or accessed remotely. Wider access to context-less speech also increases the chance of misinterpretation at some stage: a third party replaying information from the Cloud can misunderstand a sentence or attach significance where none was intended.

Siri goes shopping

Of course, there are also monetary issues with virtual assistants, since some, like Amazon Echo's Alexa, have access to your bank details and can automatically order goods via voice commands. This can be a harmless inconvenience, like a news anchor inadvertently prompting Alexa to buy a dollhouse, or something far more sinister: if someone were to ask the device to order incriminating goods, the owner would be at risk of prosecution. The financial risk therefore extends to both losing money and buying unwanted goods that can implicate the owner of the virtual assistant.

Thankfully, these security risks can be avoided by simply turning off the microphone or the purchasing feature on IoT-enabled devices, something that is easily done on the device itself. But with some virtual assistants built specifically around always-on voice control, like Amazon's Alexa, extra care must be taken while using them to avoid these security risks. Siri and other virtual assistants can be a genuine aid in most instances, but sometimes you need to remember to switch them off.
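As a rough sketch of those two safeguards, a client could expose explicit switches that default to the safer position, so nothing is recorded or ordered unless the owner opts in. The settings and function names below are hypothetical and are not taken from any real assistant's configuration.

```python
# Hypothetical settings for a voice assistant client: both risky features are
# opt-in, so nothing is recorded or ordered unless the owner enables it.
from dataclasses import dataclass

@dataclass
class AssistantSettings:
    microphone_enabled: bool = False        # software mute: no audio is captured
    voice_purchasing_enabled: bool = False  # spoken orders are rejected outright

def can_record(settings: AssistantSettings) -> bool:
    return settings.microphone_enabled

def handle_purchase_request(item: str, settings: AssistantSettings) -> str:
    if not settings.voice_purchasing_enabled:
        return f"Voice purchasing is off; '{item}' was not ordered."
    return f"Ordered: {item}"

settings = AssistantSettings()          # safe defaults
print(can_record(settings))             # False: background speech is never captured
print(handle_purchase_request("dollhouse", settings))  # order is refused
```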
