Social engineering: How psychology and employees can be part of the solution
Social engineering gives cybercriminals a way in. Learn how to lock that door.
There is something those responsible for a company’s cybersecurity must come to terms with: Technology alone is not going to get the job done. It’s time to shift some of the focus from technology to psychology, because even the most sophisticated cybersecurity systems have not prevented people from falling victim to social engineering.
You may or may not be familiar with Kevin Mitnick, the man who enlightened the world to the power of social engineering. In his 2002 book, The Art of Deception, Mitnick writes that he never used software programs or hacking tools to crack passwords or otherwise exploit computer or phone security. Instead, he compromised computers solely by using passwords and codes that he gained through social engineering.
Today, Mitnick, KnowBe4’s chief hacking officer, and Stu Sjouwerman, the company’s founder, are focused on educating us so we are no longer the lowest-hanging fruit on the cyberattack tree. Currently, the two are concerned about a specific type of social engineering–phishing. According to Verizon’s 2020 Data Breach Investigations Report, phishing is how a vast majority of data breaches start.
SEE: Social engineering: A cheat sheet for business professionals (free PDF) (TechRepublic)
“While companies tend to focus their cybersecurity efforts on technical defenses, the truth is that most cyberattacks take advantage of human fallibility,” writes Sjouwerman in a (sponsored) Corporate Compliance Insights article, The Psychology of Phishing Victims and How to Overcome it. “Successful cybercriminals take advantage of people, exploiting their usual pattern of thinking to gain access to personal information and accounts.”
What is social engineering, and why do we fall for it?
From Wikipedia, here’s an excerpt of the definition of social engineering: “In the context of information security, social engineering is the psychological manipulation of people into performing actions or divulging confidential information. A type of confidence trick for the purpose of information gathering, fraud, or system access, it differs from a traditional ‘con’ in that it is often one of many steps in a more complex fraud scheme.”
Some of the social engineering techniques most popular among cybercriminals, owing to their effectiveness, are phishing, vishing, pretexting, quid pro quo, baiting, and water holing. As to why we tend to get conned by these techniques, Georgia Crossland, a PhD researcher at the Centre for Doctoral Training in Cyber Security at Royal Holloway, University of London, suggests in the Infosecurity Magazine article Biases in Perceptions of Information Security Threats that optimism bias and fatalistic thinking are the two leading reasons. Crossland defines each as:
- Optimism bias refers to the phenomenon whereby individuals believe they are less likely than others to experience an adverse event.
- Fatalistic thinking refers to an outlook where individuals believe they have no power to influence risks personally–for example, everything can be compromised anyway, so there is little point in even trying to protect against attacks.
Cybercriminals use optimism bias and fatalistic thinking to get victims to reply to bogus emails, volunteer personal information, click on malware links, and open attachments containing malware.
Instead of the problem, people can be the solution
Crossland believes not all is lost. She contends, “Instead of seeing the human as the ‘problem,’ empowering employees to be part of the solution to information security issues might be the way forward.”
“It’s vital to have a mandatory security-awareness training program in place for your employees, but there’s no reason it can’t be inclusive and fun,” says Sjouwerman. “Cybersecurity should be the responsibility of every employee, not an abstracted function of InfoSec professionals or IT departments. If you get everyone to buy in, you can create a culture that dramatically reduces cybersecurity risks.”
SEE: Security Awareness and Training policy (TechRepublic Premium)
Sjouwerman then refers to a recent MIT Sloan Management Review article, The Unaddressed Gap in Cybersecurity: Human Performance. Its authors, Stephen A. Wilson, Dean Hamilton, and Scott Stallbaum, believe that embedding new behaviors and a shared understanding into the culture and the normal course of business is the best defense against cyberattacks.
Wilson, Hamilton, and Stallbaum add, “Fortunately, an analog exists for addressing this type of risk and leveraging human performance as a critical layer of defense: the High-Reliability Organization (HRO), which we define as an organization that has a remarkably low number of mishaps consistently over a sustained period of time yet performs highly complex and inherently hazardous tasks.”
The authors of the MIT Sloan article determined that HROs exhibit the following characteristics:
- Mindfulness: HROs exhibit chronic unease–a state of hypervigilance and watchfulness for early danger signals.
- Responsiveness: HROs identify emerging issues early and respond quickly to arrest the development of the incident.
- Learning capacity: HROs learn from every event and rapidly disseminate that knowledge to improve the system.
Final thoughts
We humans have behavioral characteristics that cyber bad guys prey on; based on the MIT Sloan article, those behaviors can be changed, or at the very least we can be made aware of them. Change isn’t easy, but it is entirely possible.