Technologies that monitor the brain are in danger of being misused during the recruitment process and in the workplace, data watchdog the Information Commissioner’s Office (ICO) has warned. The growing popularity of “neurotech” in the UK private sector has prompted the ICO to issue a stark warning about the potential for misuse and data bias in the gathering of neurological data.
The ICO says that the use of technology to monitor neurodata will become widespread within the next decade.
Regulate brain monitoring technology in the workplace
The ICO released a report today outlining fears that the boom in interest in neurotechnology and the gathering of neurodata may be misused unless appropriately regulated.
There is clear evidence of increasing interest in the UK private sector, explains the watchdog, with some 34 companies focusing on the industry.
Currently, the deployment of neurotechnology is mainly focused on the medical sector, where there are strict regulations. Scientists have been working to help patients to overcome neurological impairments with increasingly complex invasive and non-invasive devices, such as brain implants and wristband-based neural interfaces.
The technology can predict, diagnose and treat complex physical and mental illnesses, transforming how conditions such as dementia and Parkinson’s disease are managed. In May, Gert-Jan Oskam, a 40-year-old Dutch man who was paralysed in a cycling accident, was able to walk again thanks to electronic implants in his brain.
Private sector companies are also starting to work with the technology. Start-ups such as Kernel are developing devices that can be used to “read and write” long-term memories directly from the brain, while Elon Musk’s brain implant company Neuralink received regulatory approval this month to conduct its first clinical trials on humans, raising some alarm due to animal cruelty allegations levelled at the company during its earlier research.
Will other sectors utilise brain monitoring?
Sectors such as recruitment are starting to consider different uses for neurodata. The ICO has flagged this as a possible cause for concern: individuals have no control over the neurodata they emit while being monitored, and so may be discriminated against in a professional setting.
For example, workplaces could increasingly record neurodata as part of the recruitment process, helping organisations identify candidates who fit desirable patterns of behaviour or perceived traits.
“Research that combines biometric measures and organisational psychology has been called by some ‘neuro management’,” explains the ICO.
There are also concerns that conclusions drawn from neurodata may be based on highly contested definitions and scientific analysis of traits, embedding systemic bias in the processing and potentially discriminating against those who are neurodivergent.
“Finding an appropriate basis for processing is likely to be complex and organisations will need to consider fairness, transparency and data retention,” the report says.
Privacy of the subjects must be prioritised, explains Stephen Almond, executive director of regulatory risk at the ICO. “Neurotechnology collects intimate personal information that people are often not aware of, including emotions and complex behaviour,” Almond said.
“The consequences could be dire if these technologies are developed or deployed inappropriately. We want to see everyone in society benefit from this technology. It’s important for organisations to act now to avoid the real danger of discrimination.”