Smart home devices can be used to facilitate domestic abuse, a researcher told a House of Commons committee today. Dr Leonie Tanczer, lecturer in international security and emerging technologies at University College London (UCL), gave examples of abusers remotely manipulating their victims’ home environment as a form of abuse, as MPs heard evidence on the impact of ‘connected technology’.
Smart home technology and domestic abuse
Tanczer, who leads the ‘Gender and IoT’ research project at UCL, told the Digital, Culture, Media and Sport committee that smart home devices are extending the reach of domestic abusers. “You no longer need to be physically present with someone to impact them,” she warned.
The UCL researcher gave examples of abusers manipulating their victims’ physical environment using smart devices, such as changing the heating in a home or operating smart blinds, as a form of abuse. “If you’re at home and worried about things and your partner tells you they have hacker friends or abilities they don’t have, you really are in an environment where people are becoming fearful,” she said.
This is just one of the potential harms of connected technology, Tanczer told MPs. Others include the privacy risks to children using educational apps. These apps collect a vast amount of data from their users, she said, including personal and behavioural information, but children are often given no choice to opt out of sharing their data with third parties. “Most education technologies do not have the opt-out option, so you either participate in education or you do not,” Tanczer said.
“People underestimate how much [these devices] can actually collect and how much impact the privacy, security and safety risks have,” Tanczer warned, saying that more digital literacy and education is needed.
‘Smart is synonymous with surveillance’
This view was echoed by Silkie Carlo, director of privacy campaign group Big Brother Watch. “A lot of the time, the word smart is synonymous with surveillance – we’re talking about technologies that collect data on individuals, and often that is for commercial purposes,” Carlo said. “But with that comes the danger of data being used for exploitative or abusive uses as well as criminal activity.”
Carlo referred to the use of live facial recognition by the Metropolitan Police earlier this month. Disclosures by London’s police force reveal that on Saturday 16th July, it used LFR outside Oxford Circus to scan the faces of over 36,000 passers-by, the only outcome of which was a single false identification (a separate deployment of LFR earlier in the month led to three arrests, the Met’s figures also reveal).
This is an indication of “increasingly ambient surveillance,” Carlo argued. “[Live facial recognition technology] is something that has completely evaded parliamentary scrutiny,” she added.
Data harvesting can be used for social good
Also giving evidence was Antony Walker, deputy CEO of techUK, who told the committee that smart devices could help people live independently. “It’s a mistake to assume data harvesting in itself is a bad thing,” he explained.
Walker said that positive outcomes could be achieved if data was collected ethically. One example he gave was in education, where data could be used to understand how to meet the needs of the children using the applications or services, providing a more personalised learning experience.
However, he echoed his fellow panellists in saying that the government and the tech industry need to focus on what data is being collected and the purposes for which it is used to enable these possibilities.
The DCMS committee is conducting an inquiry into the impact of the growing prevalence of smart and connected technology, and what needs to be done to ensure it is safe and secure for end-users. As part of its examination, the committee will explore how devices such as smart speakers, virtual assistants like Alexa and Siri, and wearable technologies are affecting life in the home, in the workplace, and in towns and cities.
It is also examining whether existing legislation is sufficient to cope with the increasing use of connected technology. The inquiry launched on 22 May 2022 and closed evidence submissions on 23 June.