With cyber security threats constantly on the rise, many businesses are looking for ways to motivate their staff to care more about cyber security. Adversaries are becoming ever more inventive in their approach to social engineering, and many attacks - phishing in particular - are growing more sophisticated, inventively exploiting imperfections in human perception. This, in turn, costs businesses significant amounts of money and time, as it takes ever more resources to restore a "business as usual" state after a cyber attack. Therefore, many executives and managers are thinking about ways to work with human behaviour rather than relying on technological solutions alone.
Why Are Human-Centred Solutions Often Counter-Productive?
One of the main reasons why businesses prefer technological (e.g., "zero trust") solutions is that many human-centred interventions fail. Often the reason for this failure is the absence of a holistic approach to tackling human-driven cybersecurity risks. Many companies concentrate on just two measures: (i) providing information and (ii) ensuring compliance. Staff compliance with a company's cybersecurity policies is probably the most common way in which human-related cyber security issues are addressed within businesses. Many companies have a set of rules regarding their computer and data systems. For example, while some companies are very relaxed about taking data outside the company premises, allowing employees to work from home (a very useful policy under COVID-19 conditions), others are extremely cautious about this, requiring staff to use dedicated internal drives or internal clouds for all operations with data. Similarly, while some businesses allow free unencrypted communication, others heavily restrict how communication happens and even insist on encrypting all external-facing emails.
Compliance Is Not Everything
With respect to compliance, three aspects are particularly worth mentioning: policies on device ownership, USB port usage, and passwords. With regard to devices, businesses tend to operate one of two models: a “bring your own device” or an “in-house device” policy. “Bring your own device” means that employees are allowed to bring and use their own devices (laptops, iPads, etc.) to fulfil their duties, whereas an “in-house device” policy refers to a situation where each employee has to use devices and systems provided by the company. There are pros and cons associated with each model. Specifically, many large organisations (with several notable exceptions) prefer to have a single supplier of computers and limit themselves to one operating system (say, Microsoft Windows). This is primarily because the majority of employees are not technically savvy and rely on IT services to manage and fix problems should any arise, be it an issue with a particular PC or with the digital security of the organisational system as a whole. This is why such systems are built with technical solutions in mind and are designed to minimise human interaction with the fragile elements and networks of the system.
At the same time, the “bring your own device” policy is often found in smaller and more technically aware organisations. It is true that individual devices might have a higher probability of being compromised in these organisations than in “in-house device” organisations; however, the fact that different employees operate not only different systems but also different versions of those systems makes it very difficult for adversaries to infiltrate such organisations. Many research institutes, for example, allow staff to bring their own devices to work; under these circumstances, you are likely to see people coming to the office with different devices running a wide variety of operating systems. One reason why it would be difficult to launch an effective cyberattack on the Alan Turing Institute, where I work, is that HP computers coexist with Macs on the same office floor, running not only different versions of Windows and macOS but also different distributions of Linux. Therefore, even if adversaries compromise all Windows users, users of other operating systems are likely to withstand the attack and Turing will be able to recover quickly. Of course, “bring your own device” creates issues in day-to-day systems management, as it requires an IT department capable of working with multiple systems and devices.
USB port usage represents another interesting aspect of compliance policy. The nature of academic work, for example, means that I give invited talks at many different organisations throughout the year. Even when those organisations position themselves as tough on cybersecurity, they very often ask external speakers to bring their presentations on USB sticks. This simple example points to a major gap in the compliance policies of many organisations: even though it is clear that a major cybersecurity threat could be delivered to the system via an infected USB stick, employees are often left in charge of their own USB ports, as well as the USB ports of networked computers placed in various parts of their office spaces.

Internal policies on passwords, however, often supersede all other policies in terms of precision and severity. And yet these policies often lead to unintended consequences. If you recall the passwords we used for various online accounts in the 1990s, you will remember that there were no particular restrictions on them. They could consist of letters only, or numbers only, and their length was not regulated. Now think of your email password at work. It probably has to contain no fewer than eight characters, including both capital and small letters, at least one number and at least one special character. Considering that the system also prohibits the use of your name or common word phrases, all this makes the password very difficult for any normal human being to remember. Another major problem for many of us is that we have to change our passwords regularly to keep accessing our systems. The outcome - people write down passwords, store them in insecure places and delay changing their passwords until they are locked out of the company systems. Obviously, all of this is costly to organisations.
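The sort of complexity rules described above can be expressed as a short validator. The sketch below is a hypothetical policy check for illustration only (the specific rules - eight characters, mixed case, a digit and a special character - follow the example in the text, not any particular product's implementation):

```python
import re

def meets_policy(password: str) -> bool:
    """Check a password against a typical corporate complexity policy:
    at least 8 characters, with upper- and lower-case letters,
    at least one digit, and at least one special character."""
    return (
        len(password) >= 8
        and re.search(r"[A-Z]", password) is not None   # capital letter
        and re.search(r"[a-z]", password) is not None   # small letter
        and re.search(r"\d", password) is not None      # digit
        and re.search(r"[^A-Za-z0-9]", password) is not None  # special char
    )

# A 1990s-style password fails; a modern "compliant" one passes -
# yet the passing one is exactly the kind people end up writing down.
print(meets_policy("sunshine"))     # False: no capitals, digits or symbols
print(meets_policy("Summer#2024"))  # True: satisfies all four rules
```

Note that such a check enforces only surface complexity: "Summer#2024" passes while remaining guessable, which is part of the unintended-consequences problem discussed above.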
Carrot or Stick?
Interestingly, when choosing between carrot and stick measures in cybersecurity, businesses tend to use the stick far more frequently than the carrot, although several interesting carrot solutions have recently emerged (e.g., gamification of cybersecurity training, recognition and rewards, etc.).
In many large organisations, IT departments create benign bogus attacks (most often phishing attacks) and trial them on their staff. Rather severe punishments are often in place for those who fail to spot the threat, to the extent that staff members who do not perform well in the test lose some functionality of their computers. For example, in my research I interviewed a PA who worked at a large company and whose job was to schedule meetings, prepare documents, and liaise internally and externally on behalf of her manager. This PA failed to spot a fake phishing attack twice in a row and, as a result, faced internal IT sanctions: she lost the ability to send any emails for two weeks. Can you imagine the effect these sanctions had on her productivity (she had to rely entirely on telephone communication to fulfil her duties) and, more importantly, on her psychological ability to treat her organisation as a trusted entity? I also interviewed a staff member at a large corporation who revealed that his organisation imposes monetary fines for doing poorly in "surprise" cybersecurity tests, and that he once paid out over a third of his monthly salary in related penalties.
All these examples illustrate an important trend: many companies use negative reinforcement mechanisms to incentivise their staff to pay attention to potential threats. Yet very often this leads to negative results: people become overly suspicious, panic over simple emails, and fail to address potential problems constructively because they do not know how they will be viewed by management. Such measures also tend to erode trust. Many corporate employees I have talked to about cybersecurity told me that they were unlikely to inform their IT department of a potential problem (e.g., if they clicked on a link in a malicious email) simply because they thought this might have a negative impact on their promotion opportunities, salary, reputation, etc. By introducing such measures, organisations aim to expose their staff to potential threats and help them gain experience in recognising risks, but they often underestimate the consequences of negative reinforcement: instead of gaining experience, employees tend to become anxious about cyber issues. It would seem that positive reinforcement (bonuses, praise, etc.) would work far better than punishment where staff training is concerned, as it would allow organisations to encourage staff to participate in cybersecurity tasks while avoiding unnecessary blame and negative feelings.
What Actually Works?
When we look at the organisations that are most successful in motivating staff on cybersecurity, we often see that they have one thing in common: they design Key Performance Indicators (KPIs) with cyber security in mind. Importantly, they often strike a balance between positive and negative reinforcement. For example, in a series of recent interviews, one company told me that they set KPIs for team managers that incorporate spotting cybersecurity threats. As a result, this company created a unique culture in which teams compete against each other in threat-spotting, and every month the winner of the competition is announced at a managerial meeting. Importantly, the teams that came bottom of the list are also revealed, to motivate organisational units that are not so good at cyber security risk prevention to engage with this space.
When designing cyber security policies for employees, businesses need to find a balance between the carrot and the stick. Creating an effective and successful cyber security culture requires designing the right KPIs - ones that motivate employees and managers to "do the right thing".