How to address security fatigue and stop cybercriminals from winning
Security fatigue and negative social influences give digital bad guys the advantage. Experts offer tips on how to solve this cybersecurity issue.
You may be familiar with the tech saying, “It’s not if, but when.” That saying is no longer apropos. It has been replaced with “It’s not when, but how often.” One reason for that lost optimism is the uncanny ability of cybercriminals to manipulate their victims.
Troy La Huis and Michael Salihoglu discuss the issue and possible solutions in their Crowe LLP Cybersecurity Watch blog post, The Psychology of Cybersecurity.
“The unfortunate paradox is that the internet is the home and gateway to a vast abundance of cyberthreats, yet it seems impossible to try and run a business without it,” write La Huis and Salihoglu. “The seemingly endless ocean of threats can paralyze those who make decisions for an organization. They sense an ominous feeling of blood in the water yet lack clarity about how to stop the sharks from feeding.”
SEE: Social engineering: A cheat sheet for business professionals (free PDF) (TechRepublic)
What leads to security fatigue?
Much like the fatigue brought on by COVID-19, security fatigue builds as users are continuously flooded with news about this or that data breach and how their sensitive information was stolen. La Huis and Salihoglu express growing concern that users are starting to ignore warnings altogether because they feel there is nothing they can do to change the outcome. That is when security fatigue enters the picture.
A National Institute of Standards and Technology (NIST) study describes security fatigue as “a weariness or reluctance to deal with computer security.” One of the study’s research subjects mentioned, “I don’t pay any attention to those things anymore… People get weary from being bombarded by ‘watch out for this or watch out for that.'”
What’s the solution to security fatigue?
There is some thought that security fatigue can be assuaged by reversing user apathy. As to how that can be accomplished, Jason I. Hong, co-founder of Wombat Security Technologies, offers his thoughts in Alexandra Michel’s Association for Psychological Science article, Psyber Security: Thwarting Hackers with Behavioral Science.
Hong explains, “The ‘light bulb’ moment for me happened one day at my startup. Two women were talking to each other about a recent event. One said, ‘Did you hear what happened to Moe? He slipped on the ice [and dropped his laptop], and now can’t access the files on it.’ The other woman said, ‘I’m going to back up my data right now.’ And she did!”
That moment is an example of a positive social influence. Hong continues, “I had heard my colleagues in the behavioral sciences talk about concepts like social proof, commitment, and reciprocity for years, and it all crystallized in my head based on this one event, that we could also use these kinds of techniques to solve hard problems in cybersecurity.”
Hong’s example is notable in that it describes the psychological concept called social proof. In her article Social Proof: What It Is, Why It Works, and How to Use It, Shanelle Mullin writes, “Social proof is based on the idea of normative social influence, which states that people will conform in order to be liked by, similar to, or accepted by the influencer (or society).”
That reliance on observing others is why Michel says, “One of the biggest challenges in convincing people to adopt safer cybersecurity practices is that people simply don’t have much opportunity to observe each other’s behavior.”
Michel explains further, stating, “Because our internet- and computer-based lives are so private, it’s harder for us to know which practices are common and what the secure and accepted norms are.”
SEE: Security Awareness and Training policy (TechRepublic Premium)
Formulate a positive social influence
Interestingly, technology alone does not appear to be the answer; user education and building a security culture are likely the way to go. “By breaking down the threats, targets, and actions, cybersecurity specialists can help people understand their individual roles and the cybersecurity risks involved in their jobs and interactions with others,” argue La Huis and Salihoglu. “They can give people the tools to identify likely threat scenarios, how to detect them, and how to respond.”
Build a security culture
Getting back to at least “It’s not if, but when” means taking a hint from cybercriminals, who share what works and what doesn’t, and sharing is exactly what Mullin stressed in her article.
In conclusion, here is what La Huis and Salihoglu suggest might help reduce the frequency of security issues:
“The ability to improve cybersecurity posture and avoid the slow decay of concern seems to lie in making cybersecurity a digestible, positive experience. If nothing else, individuals and organizations should consider how they think about cybersecurity and about how those thoughts translate into their everyday actions.”