Does Protection Help As Much As We Think In Security?


I love it when data surprises me.

In cybersecurity, we’re good at researching how things can go wrong, but it’s harder to figure out when things are going right. Most of our prescriptive advice starts to sound obvious after all these years: least privilege. Patch all the things. Segmentation. Redundancy. Resilience. And always, always, encryption. But which practices actually lead to a successful security program?

This year we decided to take a new approach, with the help of the Cyentia Institute, founded by some of the data scientists who created the Verizon Data Breach Investigations Report. We tried to determine which security practices, or factors, appeared to correlate most strongly with the desired outcomes of a security program. The resulting report, dubbed the Cisco Security Outcomes Study, will be released on December 1 – sign up to be the first to know when it comes out.

In the meantime, here’s a little amuse-bouche to whet your appetite, one of the many findings that we didn’t have room for in the main report. It has to do with the NIST Cybersecurity Framework, that guiding beacon that helps us describe the many security practices that make up a program. Those practices are broken down into the functions Identify, Protect, Detect, Respond, and Recover, and we sorted the practices in this year’s survey the same way. Wade Baker, partner and co-founder of Cyentia, has more to say specifically about the research and its results below.

We asked respondents where their security programs place the greatest priority in terms of investment, resources, and effort, using the high-level security functions defined in the NIST Cybersecurity Framework. Respondents rated their level of priority for each function.

The figure below lists NIST functions from top to bottom based on their strength of correlation with the respondent claiming to have a highly successful security program. The bars and values indicate the expected increase in probability of overall program success associated with firms placing higher priority on each function. Because of statistical variation, that increase is expressed as a range of probability. The middle value marks the average (and most likely) increase in the likelihood of program success.
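To make that statistic concrete, here is a minimal sketch of how such an increase and its range could be estimated from survey data. This is not Cyentia's actual methodology; the column names, sample size, and data below are hypothetical and simulated, and the bootstrap percentiles are just one way to express the uncertainty as a range.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical survey frame: one row per respondent.
# 'priority_identify' is True if the firm reports high priority on the
# Identify function; 'program_success' is True if it reports a highly
# successful security program. (Simulated data, illustrative only.)
survey = pd.DataFrame({
    "priority_identify": rng.random(500) < 0.6,
    "program_success": rng.random(500) < 0.45,
})

def success_lift(df: pd.DataFrame) -> float:
    """Difference in success rate between high-priority firms and the rest."""
    high = df.loc[df["priority_identify"], "program_success"].mean()
    rest = df.loc[~df["priority_identify"], "program_success"].mean()
    return high - rest

# Bootstrap the lift so it can be reported as a range rather than a point.
boot = np.array([
    success_lift(survey.sample(frac=1.0, replace=True)) for _ in range(2000)
])
low, mid, high = np.percentile(boot, [5, 50, 95])
print(f"Increase in success probability: {mid:+.1%} (range {low:+.1%} to {high:+.1%})")
```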

NIST security functions most strongly correlated with overall security program success

The surprising thing about this chart is that the Identify, Detect, and Respond functions appear to contribute more to a successful security program than Protect does. We don’t read this as “protection isn’t important” (it contributes to many outcomes, per the next figure), but rather that the best programs invest in a well-rounded set of defenses to identify, protect, detect, respond, and recover from cyber threats. The cybersecurity field has long been protection-heavy; this says that protection alone is not the most effective strategy.

NOTE: There’s another factor at play here, and that’s statistics. Most organizations place high priority on Protect, and the success of those organizations varies substantially. Because so many firms are Protect-heavy, there is little variation in that answer to separate successful programs from less successful ones, so the statistical analysis is less likely to flag it as a key differentiator.
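As a toy illustration of that statistical effect (my own, not from the study's analysis), consider two functions with an identical underlying effect on success, one prioritized by half of firms and one prioritized by nearly all of them:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
effect = 0.15  # identical true lift in success probability for both functions

def simulate(adoption_rate: float) -> float:
    """Observed correlation between prioritizing a function and program success."""
    prioritized = rng.random(n) < adoption_rate
    success = rng.random(n) < (0.40 + effect * prioritized)
    return np.corrcoef(prioritized, success)[0, 1]

print("Correlation when 50% of firms prioritize it:", round(simulate(0.5), 3))
print("Correlation when 90% of firms prioritize it:", round(simulate(0.9), 3))
# The second correlation comes out smaller even though the effect size is
# identical, simply because there is less variation in the predictor.
```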

The number of outcomes that correlate strongly with the Identify function is unexpected. It speaks to the importance of this security cornerstone – know what you’re protecting and what you’re protecting against. The next two, Detect and Respond, seem to echo the common “when, not if” mantra for security incidents. Because it’s impossible to prevent every threat, the distinguishing mark of a successful program is detecting incidents when they do occur and responding effectively.

NIST high-level functions correlated with each security program outcome

The value and shading at the intersection of each function and outcome can be interpreted in this way: “Function X correlates with a Y% increase in firms reporting they’re successfully achieving outcome Z.”
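As a hypothetical sketch of how such a grid could be assembled (the survey's actual outcome wording and statistical modeling may differ), here is one way to compute a lift value for every function/outcome pair from respondent-level data:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
functions = ["Identify", "Protect", "Detect", "Respond", "Recover"]
# Outcome labels are illustrative; only 'Gaining executive confidence'
# appears in the text above.
outcomes = ["Meeting compliance", "Gaining executive confidence", "Managing top risks"]

# Simulated respondent-level data: one boolean column per function priority
# and one boolean column per outcome achieved.
n = 400
data = pd.DataFrame({
    **{f"prio_{f}": rng.random(n) < 0.6 for f in functions},
    **{f"out_{o}": rng.random(n) < 0.5 for o in outcomes},
})

# For each (function, outcome) pair, the lift is the difference in the share
# of firms achieving the outcome between prioritizing and non-prioritizing firms.
grid = pd.DataFrame(index=functions, columns=outcomes, dtype=float)
for f in functions:
    for o in outcomes:
        prio = data[f"prio_{f}"]
        achieved = data[f"out_{o}"]
        grid.loc[f, o] = achieved[prio].mean() - achieved[~prio].mean()

# Each cell reads as: prioritizing function F is associated with this
# percentage-point change in firms reporting outcome O.
print((grid * 100).round(1))
```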

Here we see that Identify isn’t equally effective for every outcome, though it’s usually the strongest. There is some variation, so pay attention to the patterns (e.g., Respond seems particularly relevant to ‘Gaining executive confidence’).

So what’s going on here? My own opinion about why Protect doesn’t seem to be a power player echoes Wade’s warning about statistical effects. Everyone does a lot of Protect; in fact, it was one of the first functions to be packaged into dedicated security products, so if you buy something for your organization (and who doesn’t?), it’s most likely to be firewalls, antivirus, encryption, MFA, and similar controls. That doesn’t mean “protection is dead,” or that “protection eventually fails.” It means that, given the way we researched the question of which practices correlate with outcomes, the data suggests even the organizations that said they weren’t doing as well as they’d like were still getting something out of that function. If everyone’s doing it, it’s not special.

Why does Identify correlate most strongly with the most outcomes? I believe that, for one thing, it’s a prerequisite for the other functions: if you don’t know you have it, you can’t protect it, detect attacks on it, respond to those attacks, or recover from them. But just because it’s a basic doesn’t mean it’s easy to do, and when organizations report that they’re prioritizing Identify as a function, the added capability there may influence their success overall. Identify doesn’t just mean locating all your assets once; in a dynamic environment, you have to keep up with changes. And isn’t that pretty much the same as detection? I suspect that prioritizing the Identify function builds up security muscles that bring added value to all the other practices in a program.

But don’t take my word for it; let’s look at what someone else thinks. Sounil Yu, former chief security scientist at Bank of America and now CISO-in-Residence at YL Ventures, has thought a lot about the NIST CSF functions and how their use has changed over time. In his Cyber Defense Matrix work, he describes not only how the proportions of technology, people, and process change between functions, but how the emphasis on different functions changes as an organization’s security strategy shifts from a risk-averse posture to one that is more comfortable with business risk.

A risk-taking strategy involves putting less emphasis on Protect, as those types of security controls may be more likely to constrain business and technology functions. If you build up visibility along with your Detect and Respond capabilities, you can take advantage of more speed and agility. This might explain why the respondents in our survey who claimed more successful outcomes in their security program also reported prioritizing more than just Protect.

We could spend a lot of time speculating about why we got these results, or we could simply grab a shovel and dig deeper. If this is a side effect of the statistical analysis we used, we could try different methods in a future survey; a lot rests on how you ask a particular question and how you compare the answers. If it really means that protection alone isn’t enough, and that succeeding at cybersecurity requires paying equal attention to the other functions, then we can learn more and turn that into specific recommendations.

I hope you’ve enjoyed this early look at some intriguing data, and stay tuned for the main report, coming out on December 1! In the meantime, here’s a good conversation in which Wade Baker, Jason Wright from ThreatWiseTV, and I discuss this topic.

Sign up to be the first to know when the Security Outcomes Study comes out, and join us on December 1 for our live broadcast, Proven Factors for your Security Program, where we’ll discuss its most surprising findings.
