As the digital and real worlds become ever-more entwined, do we really understand how people and technology interact? In this blog, republished from our OnDigitalTrust publication, Prof Gerard Hodgkinson explains why both person and machine should be considered when introducing and safeguarding digital infrastructure.
- Historically, new technologies have often failed to meet expectations because human behaviour was not considered
- Security systems are only as good as the people who use them, with employees often creating ‘workarounds’ to bypass cumbersome security procedures
- Policymakers need a better understanding of how they can promote digital security, through techniques such as education and ‘nudge theory’
We live in a digital age. Along with vast and varied benefits, this revolution has brought equally vast and growing risks. From data mining and fraud, to activity that could close a company or even put national security and lives in danger, mitigating these risks is an extremely high priority. But while billions are spent on technological fixes, are we failing to see – and more importantly act – on the psychological, social and cultural factors at play?
A growing problem
An alarming increase in the number of high-profile cyber security breaches shows not only the growth of cyber crime and hacking, but also the vulnerability of systems that ought to be highly secure. Take the well-documented case of Gary McKinnon, the Scottish UFO enthusiast who hacked into US military and NASA computers and ended up in a high-profile extradition case, indicted by a federal grand jury on seven counts of computer-related crime. Or the Windows-based computer worm known as ‘Stuxnet’, which was implicated in the temporary shutdown of Iran’s uranium enrichment programme observed by UN inspectors in November 2010. It is all too apparent that much of the world’s infrastructure and its associated data systems are highly vulnerable and inadequately protected. Governments, hospitals, financial institutions, educational establishments, transportation systems, professional services providers, and manufacturing installations are all susceptible; indeed, no sector is immune. So what is being done?
In an attempt to safeguard their hardware, software, and data, many organisations have sought technical solutions, investing increasingly large sums of money in enhanced antivirus software, elaborate authentication procedures, and associated encryption systems. However, cases such as the ones highlighted above illustrate the ease with which people intent on this kind of activity are often able to overcome such technical ‘fixes’, no matter how elaborate and seemingly impenetrable.
Learning from history
As ever, we can learn from the mistakes of the past. Many new technologies have been introduced and then failed to perform as expected because human behaviour was not considered. The concept of ‘sociotechnical systems’ was pioneered as long ago as the 1950s, when researchers investigated why new mechanised methods for extracting coal more quickly, and more safely, had not delivered the expected increase in productivity. The reason? The new technology had been introduced without anyone considering that it would dramatically reduce and alter social interaction between the miners. The designers had unintentionally eroded some of the social psychological benefits of the traditional mining techniques their innovative machinery replaced.
A complex blend of human and machine
It is clear then that digital technologies, like all technologies, are only one half of a more complex system, the technological elements being inextricably intertwined with the human beings who make use of them. Safeguarding the digital infrastructure at the heart of the world’s economy, therefore, demands a sociotechnical approach, which seeks symbiotic solutions that blend both technological and human insights. The absence of such cross-disciplinary thinking in organisations risks the development of solutions that place unrealistic demands on employees, leading in turn to workarounds (‘I’ll just work around this’) that bypass the burdens of compliance.
If people find a nominally secure system difficult to use, they will inevitably be motivated to find such workarounds; all too often, for example, they ‘work around’ the need to set passwords, posing a massive threat to computer security. Gary McKinnon was able to hack into 97 US military and NASA computers using nothing more sophisticated than a simple script that could search 65,000 computers for blank passwords in less than eight minutes. Having accessed highly sensitive information through this surprisingly straightforward approach, he then deleted critical files from an assortment of operating systems. The US authorities claimed that his actions rendered the Army’s Military District of Washington network of 2,000 computers inoperable for 24 hours. The clean-up operations are estimated to have cost around $800,000, a figure that excludes the costs, borne by US and UK taxpayers, of nine years of legal proceedings.
Redressing the imbalance
The research my colleagues and I are undertaking here at The University of Manchester is seeking to redress the fundamental imbalance of understanding between the human elements and the technological elements underpinning cyber crime, to find more effective ways of addressing it. Drawing on the insights of anthropology, behavioural science, criminology, and sociology, among other specialist fields, we are working closely with computer scientists, in a wide-ranging programme of multidisciplinary work to augment, and in some cases challenge, some of the more conventional technical approaches designed to enhance digital security and thwart the efforts of cyber criminals.
There are many ideas and areas for further research and exploration. For example, nudge theory in behavioural economics – making it easy for people to embark on desired courses of action – has helped shape all kinds of behavioural change, so why aren’t we applying it more often to enhance digital security, one of the most pressing challenges of our times? By designing simple behavioural routines that fit easily into people’s everyday task environments, and avoiding anything too onerous, we can achieve far more effective results.
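To make the nudge idea concrete, here is a minimal, hypothetical sketch in Python of a password-setup routine whose zero-effort default is a randomly generated passphrase, so the path of least resistance is also the secure one. The word list and function names are illustrative assumptions for this sketch, not part of any system described in the research above.

```python
import secrets

# Illustrative word list; a real deployment would draw on a large dictionary.
WORDS = ["orbit", "maple", "river", "quartz", "copper", "willow", "summit", "lantern"]

def suggest_passphrase(n_words: int = 4) -> str:
    """Generate a random passphrase using a cryptographically secure RNG."""
    return "-".join(secrets.choice(WORDS) for _ in range(n_words))

def set_password(user_input: str = "") -> str:
    """The nudge: pressing Enter (providing no input) accepts a strong
    generated passphrase, so the easiest choice is the secure one.
    Users can still opt out by typing their own password."""
    if not user_input:
        return suggest_passphrase()
    return user_input
```

The design choice here is the one the paragraph above describes: rather than demanding effort (an onerous password policy that invites workarounds), the secure behaviour is made the default, and opting out requires the extra effort instead.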
We also need to learn more about how culture variously gets in the way of, or helps to promote, digital security and trust. To illustrate, one organisation I have been working with was blissfully unaware that its team of cyber security experts was perceived by many of its managers and front-line employees as an outsider group, ‘geeks’ who lacked sufficient understanding of the day-to-day realities of people’s jobs. The everyday practices and artefacts displayed around the organisation’s premises reinforced an overwhelming sense that cyber security was a lower priority than the many other issues it was attending to, resulting in cynicism towards the cyber security team and the policies it enacted. Not surprisingly, cyber security workarounds were commonplace.
Effective communication and education are key. Some of my own research, published more than a decade ago, found that most people could identify risk scenarios of varying seriousness in terms of their consequences or likelihood of taking place. However, serious ‘meltdown’ scenarios were typically considered very unlikely, whereas scenarios expected to be trivial in their effects were considered very likely. A decade on, it’s sobering to reflect on the numerous examples of real-life cyber security breaches that closely match the more serious and consequential ones that our study participants identified as low-frequency, low-likelihood events. Of course, the actual frequency of such events is difficult to measure objectively, not least because organisations don’t like to share how close they have come to a major catastrophe or how they have managed to get out of one, so the true scale of this issue is not public knowledge.
There is much work still to do. But in the meantime, there are some simple, practical ways to use behavioural and social science insights to enhance digital security and mitigate the risks associated with cyber crime in the workplace and beyond:
- Consider not just the technology but also how people will interact with it. These are inseparable components of one, sociotechnical system. If either of these essential components fails, then the whole system fails.
- Consult with and incorporate the views of users. This not only directly affects whether people comply (non-compliance leaves organisations vulnerable); it also has a material bearing on job satisfaction and productivity.
- Make it easy for people to do the right thing – consider how nudge theory and other behavioural science insights and techniques could support your plans.
- Consider how you will communicate why particular cyber security solutions are being implemented and do not assume employees have the requisite prior knowledge. Most people do not have a clear sense of where the risks lie, what those risks really are, and how their behaviour fits into the bigger picture.
In the final analysis, no amount of investment can ever eradicate the growing security threats confronting the digital economy. Organisations need to develop routines that ensure they maintain situational awareness and adapt their mitigation strategies as threats evolve.
Investment needs to shift towards approaches informed by social and behavioural science. Governments, businesses, and other organisations need to be better educated about how attitudes and behaviour can enhance, or undermine, efforts to promote digital security and prevent the growing threat of cyber crime.