Imagine your football team has just narrowly lost a game. Who’s responsible for the defeat? Is it the goalkeeper who let the ball slip through their fingers in the first half, or the striker who missed a sitter in the closing minutes? Maybe it’s the manager’s fault for failing to devise and implement a successful game plan?
Now take this analogy and apply it to a business trying to assign blame in the aftermath of a cyber attack. Does the blame lie with the IT department for failing to put effective cyber defences in place, or is it the fault of the CEO for not implementing a culture of cyber awareness? Perhaps the employee who clicked the link that contained malicious software should take responsibility?
Many businesses opt for the latter choice. Research from security company Tessian found that 21% of the 2,000 US and UK workers they surveyed have lost their job in the past year after making a mistake that compromised their company’s security.
Impact of remote working
Irina Brass is associate professor in regulation, innovation and public policy at University College London. She says the figures show “a knee-jerk reaction. There’s a lot more organisations can do to become more resilient before placing the blame on employees.”
One option is a refreshed cybersecurity training programme that reflects post-pandemic working patterns. While many businesses provide such training to their employees, it often overlooks the new vulnerabilities exposed by the technologies that enable widespread remote working.
The cloud is one example. Nearly four in ten businesses accelerated their migration to cloud technologies during the pandemic, according to McKinsey, and 86% of those expect the acceleration to persist post-pandemic.
But as Brass points out, the cloud creates more routes for cybercriminals to hack a business, undermining perceptions of the technology as a completely secure option.
“It makes the attack surface larger and more homogenous because you have these cloud-based work packages that are the same being deployed to large numbers of people, meaning that once a hacker figures out a particular compromise, they can apply it to all sorts of replicas.”
With more applications and tools stored in the cloud, more people require access to it. As a result, the amount of data the average employee can reach has grown dramatically in a short period of time.
Cybercriminals are exploiting this. They’re using the primary benefit of the cloud – its ability to connect workers to essential company documents regardless of their location – to access large amounts of data through a single breach.
Dangers of consumer devices
This is part of a broader set of challenges, stemming from the fact that our home environments are fundamentally less secure than offices.
The immediate shift to home working exposed our work laptops – and businesses’ data – to an array of consumer Internet of Things (IoT) devices. Smart products in the home – from light switches to speakers – face an estimated 12,000 hacking attempts each week, according to Which?. Smaller, cheaper products often lack the security features of traditional computers, making them easier for cybercriminals to compromise.
The threat posed by the lax security of some IoT devices would ordinarily be confined to consumer data. But with widespread remote working, these devices now act as a gateway for hackers looking to access a company’s data.
“Most consumer devices have dubious security specifications,” Brass says. “They have default passwords and really short software update periods, if at all. And if a hacker compromises them, they scan a work device that is on the home network for vulnerabilities and an employee won’t even be aware it’s happening.”
The correlation between the shift to remote working and rising cyber attacks suggests it’s the unique working environment caused by the pandemic, rather than employees themselves, that’s driving the spike in breaches. The solution may be further training for employees on the importance of cyber hygiene both in and out of working hours.
Legal questions
The legality of dismissing an employee after they make a cybersecurity mistake also warrants consideration. According to Monica Atwal, managing partner and employment law specialist at Clarkslegal, an employer’s reasoning for dismissing an employee usually falls into two categories: gross negligence or gross misconduct.
These reasons require an employer to prove, on the balance of probabilities, that the employee was either guilty of serious carelessness or engaged in a clear and serious violation of the company’s rules.
The lack of regular cybersecurity training offered to employees undermines the validity of these reasons, Atwal adds. A study from Software Advice earlier this year found that 44% of SMEs have not trained their team on cybersecurity since 2020, despite 62% of them experiencing an increase in cyber attacks in the same period.
The absence of a “systematic approach to cybersecurity” weakens an employer’s argument for dismissal by gross negligence, in Atwal’s opinion, because of the need to demonstrate that an employee received “intensive training” on a “regular basis” and still acted carelessly.
“An employee would clearly have an unfair dismissal claim and you would get short, sharp shrift from an employment judge if you said you received one training session on something that is so complicated and nuanced,” she says. “It’s like being accused of stealing when you don’t even know you’ve taken something.”
Pinning the blame
Despite the risk of being sued for unfair dismissal, Tessian’s research shows that many employers believe employees should shoulder the blame for any cyber incidents that happen on their watch.
The same research found that 29% of businesses have lost a client because of a cyber mistake in the last year. Pointing to this, Jeff Hancock, founding director of the Stanford Social Media Lab, notes that many employers try to “pin the blame” by dismissing employees after a breach.
“Businesses want to provide a reason for why it happened to their clients,” Hancock adds, “but this comes at a long-term cost because employees are going to be less likely to report attacks in the future.”
A policy of dismissing any employee who makes a cyber mistake risks instilling a culture of fear around reporting such incidents. In time this would leave a business more vulnerable to hackers, as employees become unwilling to report any breaches or vulnerabilities they’ve noticed in the company’s cyber defences.
The solution, in Hancock’s opinion, is a company culture where cybersecurity is at the forefront of every employee’s mind, regardless of their position. This would involve regular training sessions on the latest techniques cybercriminals are using. It would be underpinned by an understanding between employers and employees that cyber breaches are ultimately inevitable and not the responsibility of any one person.
In a similar way to a football team dealing with a loss, a breach is rarely the fault of a single individual. Good cybersecurity requires input from every employee at a business, whether they’re the CEO or an intern.
How human psychology causes cyber attacks
Many high-profile cybersecurity incidents paint a misleading picture of the type of attack businesses should expect. Most breaches don’t result from a hacker circumventing an organisation’s cyber defences. Instead, cybercriminals are increasingly incorporating social engineering techniques into their scams, relying on psychological manipulation, rather than technology, for success.
Phishing emails are one example of a social engineering scam. These employ a wide range of psychological manipulation techniques to fool the recipient into opening a link or attachment that contains malicious software.
Some prey on people’s fears, anxieties, or emotions, causing them to lower their defences and let a hacker into their system. Others invoke a sense of scarcity or urgency to goad a victim into acting quickly without thinking.
Jeff Hancock, founding director of the Stanford Social Media Lab, regards cybercriminals as “good psychologists”, given the wide range of manipulation techniques they use. However, as Hancock points out, there are cognitive vulnerabilities that are unique to the workplace, making businesses particularly vulnerable to these types of scams.
“With businesses, the hackers will know about social relationships. You can easily see who someone’s boss is, and because many people are deferential to authority, this creates a good attack for hackers looking to get employees to share confidential information.”
Widespread home working has exacerbated this issue, with many employees losing the face-to-face time with their managers that’s essential for trust building. Cybercriminals exploit this by creating scams that prey on an employee’s desire to impress senior team members and the vulnerabilities unearthed by isolation.
Often these scams lead the victim into a decision-making process that is quick, complex, and vulnerable to emotional persuasion. This combination is highly effective when the victim is unable to speak to their colleagues and get a second opinion on a suspicious-looking email.
Such vulnerabilities reinforce the perception that staff are the weakest link in an organisation’s defence against cybercriminals. Yet the psychological weaknesses hackers exploit rarely receive the same attention in cybersecurity training as technological threats.
Good cybersecurity is about more than technology. With social engineering scams on the rise, businesses need a training programme that teaches employees both what cyber attacks look like and the psychology that underpins them.
Are employees careless or ignorant about cybersecurity?
When an employee enables a breach, employers may need to determine whether it came about through carelessness or ignorance about the threat posed by hackers. The answer could guide any disciplinary action.
However, reaching a concrete answer is a complex process, particularly given the volume of new attacks appearing every day. According to the AV-Test Institute, approximately 450,000 new pieces of malware are detected daily, while hackers send roughly 3.4 billion phishing emails to potential victims each day.
The range and frequency of attacks on businesses complicates the training process for employees. Team members must be constantly vigilant about a variety of threats that prey on both the technological vulnerabilities of the business and the psychological vulnerabilities of its staff.
Despite the increase in attacks by hackers, Julien Soriano, chief information security officer at Box, believes employees must be aware of the data they have access to.
“You cannot separate ignorance from carelessness in the wake of a cyber attack. Ignorance is carelessness,” he says. “It is the employees’ responsibility to do the right thing and comply with their employer’s policy and understand their role and responsibility in keeping their access safe.”
The lockdown-driven surge in remote working dramatically increased the amount of data employees can access. Virtually overnight, businesses shifted to a work-from-home model, meaning their data moved into a more vulnerable environment just as many employees gained greater access to critical documents and information than ever before. This increased both the likelihood and the repercussions of a cyber attack, so employees need to be hyper-aware of the threat posed by hackers.
However, with remote working here to stay, the only careless or ignorant actor in the aftermath of a cyber incident is the employer who fails to protect their data, says Alex Rice, co-founder and CTO at HackerOne.
“Inevitable human error is never a satisfactory explanation for a cybersecurity incident. If a human simply clicking a single link caused a breach of your company, your security practices are to blame, not the human,” he says.
“If a company acts to its best ability to reduce cyber risk, it is not anyone’s ‘fault’ beyond the cybercriminals that chose to commit the crime. We need to get out of this toxic blame cycle that discourages transparency and continuous improvement.”