The cybersecurity profession has long acknowledged that people are a crucial factor in any organisation’s security risk profile. As a result, cybersecurity awareness training is a growing slice of security expenditure: the UK’s Cabinet Office, for example, reportedly spent £274,000 on security-related training last year, up 483% from 2019.
But there are doubts about how useful mere ‘awareness’ is in reducing cybersecurity risk, as awareness alone does not necessarily lead to safer behaviours. Applying the science of behaviour change can help organisations modify what employees do, not just what they know, and therefore measurably reduce risk, argues Oz Alashe, CEO of CybSafe.
Alashe’s interest in the human side of security was ignited during his time in the military, where his focus was on counter-terrorism and national security. “We needed to find some unpleasant people and to dissuade them from doing unpleasant things,” the former UK Special Forces Lieutenant Colonel recalls. “I got a very good understanding of how we identify people who don’t want to be found, physically or digitally.”
At CybSafe, Alashe’s focus has moved to encouraging ordinary people to adopt safe cybersecurity behaviours. The company was founded in response to an apparent paradox, he says. “If training is so important, why are so many people making mistakes and falling victim to cybercrime? And how do we explain these mistakes? The answer is behavioural science.”
While most cybersecurity training focuses on awareness, CybSafe’s work builds on insights drawn from research into behaviour change, such as the COM-B model. This proposes that capabilities, opportunities and motivation (COM) all feed into an individual’s behaviour, Alashe explains.
In the context of cybersecurity, ‘capability’ refers not just to the knowledge required to perform a given behaviour, but also to the tools, the time and the understanding needed. “You’d be amazed at how many times [cybersecurity professionals] ask people to do things they can’t actually do.”
‘Opportunity’ refers to the way in which an individual’s context supports the desired behaviour. “The environment plays a much bigger role in behaviour change than security professionals give it credit for,” says Alashe. “We’ve seen a 67% improvement in the number of users who conduct a positive cybersecurity behaviour – changing the default password on their WiFi router for example – if we allow them to remind themselves at the right time.” The ‘M’, for motivation, is the area most familiar to cybersecurity professionals, Alashe adds.
Behavioural science and cybersecurity: putting it into practice
CybSafe puts this model into practice with a software platform that delivers behavioural ‘nudges’ for secure behaviours, then analyses their effectiveness. That analysis, backed by behavioural research, is what distinguishes its offering, Alashe says. “In our view, there’s a difference between a typical prompt or alert and an evidence-based nudge. [We ask] what is the evidence for that kind of prompt, and if there isn’t evidence, how do we collect that evidence to determine whether it achieves what we want it to achieve?”
One insight to have emerged from this analysis is that an individual’s confidence in their understanding of a security principle is an important determinant of behaviour change. “In security compliance training, people are often given some information then, a few seconds later, they’re asked if they remember it. Most people are intelligent enough to remember that information 90 seconds later but it doesn’t really indicate whether you’ve reduced the risky behaviour,” explains Alashe. “But assessing their confidence can help.”
So do cybersecurity professionals need to become armchair behaviour scientists? Not so, says Alashe. “What we need is for them to ask questions, to say ‘show me the evidence that this [training] is working’.”
A lesson that cybersecurity professionals can draw from behavioural science, though, is to focus less on the individual and more on the context in which they are working. “There’s a common attitude among cybersecurity professionals, that ‘people are the weakest link’, that’s actually quite unhelpful,” he says. “If people are not doing what you want them to do, it might be that your policies are not that helpful, or that you’ve given them tools that require them to make several steps more than they would have done before.”
Indeed, this is the essence of ‘people-centric security’, Alashe argues. “It doesn’t mean making people the focus of your attention. People-centric security means recognising that people are part of the system, and that security that doesn’t work for people doesn’t work.”
Oz Alashe will be speaking on the ransomware panel at next month’s Black Tech Fest virtual festival, of which Tech Monitor is a media partner.