Through Your Mind’s Eye: What Biases Are Impacting Your Security Posture?
Cybersecurity and biases are not topics typically discussed together. However, we all have biases that shape who we are and, as a result, influence our decisions in and out of security. Adversaries understand that humans have these weaknesses and try to exploit them. What can you do to mitigate bias as much as possible and improve your cybersecurity posture across all levels of your organization?
Cybersecurity personnel have many things to address and decisions to make every day: which alerts to investigate, which systems to patch for the latest vulnerabilities, what to tell the board of directors. However, our brains don’t give each decision equal attention; we take mental shortcuts. These mental shortcuts are known as cognitive biases, and they allow us to react quickly.
In this two-part blog series, we’ll explore the types of cognitive biases that could be affecting your company’s security posture and give you tips on how to address these biases.
Part One: Types of Cognitive Biases
Do you feel you are biased? We all are, to some extent. What do you see when you look at the picture below: faces or a vase? Some people see one or the other, and some see both. This is representative of what happens in real life. Many of us attend the same meeting yet leave with different perspectives on the discussion. That is our cognitive biases at work.
A cognitive bias is a result of our brain’s attempt to simplify the processing of information. The formal definition is “a systematic pattern of deviation from norm or rationality in judgment”. We as individuals create our own “subjective reality” from our perception of the inputs, and that construction of reality, not the input itself, may dictate how we behave.
Availability Bias
Availability bias is a mental shortcut our brains take based on past examples and whatever information is “available” to us around a specific topic, event, or decision. This information could come from the news, from a friend, from something we read, or from personal experience. When we hear information frequently, we can recall it quickly, and our brains treat it as important as a result. With all the urgent interruptions and the sheer volume of decisions CISOs and other cyber executives must make, it is very easy to get caught up in decision making based on whatever information is most recent or memorable.
Availability bias impacts security in many ways. We often see its effects in risk assessment, preparedness, decision making, and incident response. In risk assessment, availability bias may arise when the company’s board of directors asks for an updated risk assessment. Rather than covering the entire company, the data presented may skew toward the area in which another company recently suffered a breach. For example, SolarWinds has been in the news throughout the first quarter of this year, and our inclination might be to assess our risk in the context of that incident. However, assessments should examine all aspects of the business in depth, not just supply chain risk. Are there issues that require more attention than whatever is trending in the news?
We also see availability bias in preparedness when organizations prepare for high-impact, low-probability events instead of high-probability events. What we should worry about doesn’t always align with what we do worry about. Events with high impact but low probability, such as an airplane crash, a shark attack, or a volcanic eruption, receive a great deal of attention, and we remember them far more readily than higher-probability events like falling off a ladder or an automobile accident. For instance, can you name the last phishing campaign you heard about, or the last time someone’s PII was stolen? Probably not, yet these are the high-probability events your organization most likely needs to prepare for.
In the area of decision making, your CISO or your cybersecurity analysts may favor hot topics in the news. These topics may overshadow other information that they know but that is so mundane it becomes background noise, and the resulting decisions are not well rounded. For example, after an IoT-related incident like the Dyn attack in 2016, your analysts may over-focus on IoT security decisions and neglect things like investing in new security controls for your mobile devices.
Availability bias also surfaces during critical incidents, when emotions typically run high and the focus is on quickly addressing the issue at hand. Focusing on securing the specific area where the incident occurred may leave us blind to another issue waiting in the wings. Suppose someone broke into your home through a window. Your first thought may be to secure all the windows quickly; however, if you don’t review all of your security risks, you may forget that your garage door lock pops open when shaken.
Our analysts typically explore the data thoroughly, though executives may not always see that in-depth information. If you are at the executive level, I recommend reviewing all the facts and consciously looking beyond what is quickly available, so you get the full picture of the incident, your preparedness, your risks, and so on. If you are an analyst or in a position of influence, I recommend summarizing the facts in a way that accurately reflects the probability of those events occurring, as well as considering all possible events.
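One lightweight way to keep such a summary grounded in frequencies rather than headlines is to let the data rank itself. Here is a minimal sketch in Python, assuming a hypothetical alerts.csv export with a category column; the file name and column are placeholders, and a real SIEM export would need its own field mapping.

```python
# Minimal sketch: rank incident categories by observed frequency so a
# briefing reflects what actually happens, not what is freshest in memory.
# Assumes a hypothetical "alerts.csv" with a "category" column; adjust the
# file name and column to match your SIEM's export format.
import csv
from collections import Counter

def summarize_alerts(path: str) -> None:
    with open(path, newline="") as f:
        counts = Counter(row["category"] for row in csv.DictReader(f))
    total = sum(counts.values())
    print(f"{'Category':<30}{'Count':>8}{'Share':>8}")
    for category, count in counts.most_common():
        print(f"{category:<30}{count:>8}{count / total:>8.1%}")

if __name__ == "__main__":
    summarize_alerts("alerts.csv")
```

A frequency-ranked table like this makes it harder for the most recent or most publicized event to crowd out the phishing and credential-theft incidents that actually dominate the log.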
Confirmation Bias
Another bias that appears in cybersecurity is confirmation bias. This is when you look for things that “confirm” your existing beliefs, or remember only the things that conform to them (similar to availability bias). For example, your news feed may be full of articles aligned with your political beliefs based on what you clicked on, shared, or liked; chances are it’s not filled with things that oppose your beliefs. A few areas where confirmation bias appears in cybersecurity are decision making, security hygiene, risk assessment, preparedness, and penetration testing.
When you are making decisions, are you considering different points of view or just looking to your close group of trusted advisors who may think like you? Are you willing to push and challenge your own beliefs to ensure you are making the best decisions for the company?
When was the last time you reviewed your company’s security hygiene? Are you diligent about updating systems, or do you believe a breach won’t happen to you because nothing has happened in the past? Are you using an XDR solution in your environment, or do you feel you don’t need one because all your current systems are serving your needs just fine? Do you feel you are more secure in the cloud than on-premises, despite human error affecting both?
How do you approach cybersecurity preparedness? Are you passive, reactive, or progressive? As with hygiene, do you feel an incident won’t happen to you, and so look for data that confirms it? Or are you at the opposite extreme, convinced incidents will repeat unless you do everything possible, and seeking only data that confirms that belief? If you are an executive, are you reviewing the facts and evidence for all your cyber processes, or just the ones you know well from early in your career? I’ve seen analysts ignore some of their alerts because they weren’t quite sure how to deal with them; they fall back on what they know or on information that is readily available.
Organizations sometimes hire third-party companies to perform penetration testing on their environments. When you define the scope of work, are you looking for every gap and hole, or only probing the areas you already believe to be weak or strong? When the results come in, do you address everything that is recommended, or only the items you believe will impact you?
It is hard to look beyond what we believe, because in our own eyes it is ground truth. When making security decisions, it is important to look past what we want to hear or see and get to what we need to hear and see.
Unconscious or Implicit Bias
Unconscious or implicit biases are social stereotypes about certain groups of people that we form outside of our own conscious awareness. As the iceberg picture illustrates, the conscious mind is what we can recall quickly and are aware of. The subconscious mind stores our beliefs, previous experiences, and memories; when an idea, emotion, or memory from the past comes to mind, the conscious mind is recalling it from the subconscious. The third layer, the unconscious mind, lies deep inside our brain.
Everyone holds unconscious beliefs about various social groups; these biases stem from our tendency to organize the social world by categorizing it quickly. We often think of unconscious bias in the context of negative biases, but there are positive unconscious biases too, for example, feeling a connection to someone from your hometown or college alma mater. Unconscious bias impacts security in decision making, risk assessment, incident response, cybersecurity policies and procedures, and identity and access management.
In the area of decision making, I’ve seen executives blindly trust the IT team because they are perceived as the “experts”. While this may be true, they are wrestling with the same unconscious biases and skills shortages many of us are. Just as it’s important to seek out additional information and facts when making your own decisions, it’s equally important to review the data and provide feedback and alternate opinions to others. It is often easy to go with the majority and not rock the boat, but if you feel that something needs to change or be addressed differently, don’t be afraid to go against the flow. Mark Twain is often quoted as saying, “Whenever you find yourself on the side of the majority, it’s time to pause and reflect.” When was the last time you went against the majority?
Another unconscious bias that sometimes arises is related to age. Some people feel older workers are a greater risk to a company than younger workers because they perceive older workers as not being “up to date” on newer technologies. Conversely, some feel younger people engage in risky behavior like visiting potentially suspect websites or sharing too much information on social media. As a result, security analysts may focus on the wrong areas as the source of a security risk or issue based on their biases.
If you had an incident, how would you respond? Would you blame an insecure IT environment or incompetent end users, or would you look at the facts and evidence, both inside and outside your beliefs, to determine what happened? How would you and your team respond to the incident? If your security operations team felt that IT had not done its part before the incident, you may be looking in the wrong area for its source. You may have heard the acronym PEBKAC; for those who don’t know, it stands for “problem exists between keyboard and chair”. Are you sure the problem is PEBKAC, or does it lie somewhere in your environment?
Implicit trust is another form of unconscious bias. When was the last time your cybersecurity policies and procedures were thoroughly reviewed? Let’s say you feel your SOC analyst is amazing, and you trust everything they say. Because of this implicit trust, you don’t think to dive into the details. As a result, you could have a firewall running without any defined rules and never know it, because you’ve never checked. This doesn’t mean your SOC analyst isn’t trustworthy, just that you shouldn’t allow your unconscious bias to overrule the necessary checks and balances.
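As an illustration of such a checks-and-balances step, here is a minimal “trust but verify” sketch in Python. It assumes a Linux host where iptables is in use and the script runs with root privileges; on other platforms or firewall stacks the equivalent check would look different.

```python
# Minimal sketch: verify the host firewall actually has rules loaded
# rather than assuming someone configured it. Assumes a Linux host with
# iptables and root privileges; other firewall stacks need their own check.
import subprocess

def firewall_has_rules() -> bool:
    out = subprocess.run(
        ["iptables", "-S"], capture_output=True, text=True, check=True
    ).stdout
    # "-P" lines are default chain policies; "-A" lines are explicit rules.
    rules = [line for line in out.splitlines() if line.startswith("-A")]
    policies = [line for line in out.splitlines() if line.startswith("-P")]
    print(f"{len(policies)} default policies, {len(rules)} explicit rules")
    return bool(rules)

if __name__ == "__main__":
    if not firewall_has_rules():
        print("WARNING: firewall is running with no defined rules")
```

The point is not the specific tool but the habit: even a trusted analyst’s work deserves periodic, independent verification.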
Unconscious bias can also lead us to overconfidence. For example, when writing a paper or an article, we can be certain there are no mistakes or typos, but often that’s because we’ve read or reviewed it so many times that our unconscious mind reads it as it should be rather than as it actually is. Similarly, in the area of identity and access management, security analysts and software developers may blame users for issues and fail to look at the internal infrastructure or their own code, because false confidence leads them to believe they couldn’t possibly be the problem.
To overcome unconscious and implicit bias, stick to the facts and ask all stakeholders, including those you may disagree with, for input. Also look in the mirror: did you make a mistake, or are you excusing your behavior instead of facing it? And don’t be afraid to follow the words of Mark Twain and pause and reflect, to ensure you are making the correct decision, addressing the incident in the correct way, or hiring the right person.
Because we all have biases and take mental shortcuts, we need to make a conscious effort to address them. Look beyond what you want to hear or see, and beyond what shows up in your news feed, to address availability and confirmation bias. Stick to the facts and ask all stakeholders, including those you disagree with, for input to overcome unconscious and implicit bias. You don’t want to be the next company in the news because your biases got in the way.