Industrial Cyber Security and the Florida water supply attack
Through the lens of the Florida water supply hack, Dale Peterson explains how events like these remind us to take the necessary steps to maintain our cybersecurity. Founder and chair of S4 Events, Dale has been helping security professionals effectively and efficiently manage risk to their critical assets for over 15 years.
Spotify: https://open.spotify.com/show/5UDKiGLlzxhiGnd6FtvEnm
Stitcher: https://www.stitcher.com/podcast/the-tripwire-cybersecurity-podcast
RSS: https://tripwire.libsyn.com/rss
YouTube: https://www.youtube.com/playlist?list=PLgTfY3TXF9YKE9pUKp57pGSTaapTLpvC3
Tim Erlin: Welcome everyone to the Tripwire Cybersecurity Podcast. I am Tim Erlin, vice president of product management and strategy at Tripwire. Today I am joined by Dale Peterson, who is the founder and chair of S4 Events and the CEO of Digital Bond. He has been working with cybersecurity for industrial environments for many, many years.
Dale Peterson: Yeah, I stumbled into it in 2000. Actually, I did my first water SCADA assessment back then when I had no idea what SCADA was, and I’ve been learning and enjoying it since then.
What Happened in Oldsmar
TE: Recently, there was a very high-profile public disclosure of an attack on a water treatment plant in Florida. What do we know about what happened at that water treatment plant?
DP: It was a very small event that created a large ruckus because it was a small municipal water utility in Oldsmar, which is in Florida. It services about 15,000 people.
You could almost think of it as being not protected in any way. It’s almost as if you had your web server and you hadn’t patched it for vulnerabilities and were using default credentials. It was just easy pickings for anyone who wanted to get to it.
What made it unique was that someone found it, connected to it, and used the remote control software TeamViewer to increase the level of lye in the system. Eventually it could have caused a problem: they said that in 24 to 36 hours, if no one had detected it, it could have affected the drinking water and been harmful to people. But the odds of that happening were very, very small because it was such a large increase. There were alarms all over the place.
We see a lot of use of TeamViewer and these other programs to allow for remote control. The large organizations will have some security around their remote access. They’ll typically have a VPN with two-factor authentication and such. But this is really a challenge for the small utilities, a small manufacturer or something like that, where the IT team is so small that they don’t even think about having an OT team. So, they tend to not do things well. If anything, this pointed out to me that we as an industry haven’t done a good job of informing these small players about the most important things that they need to do.
TE: Yeah. That gets to one of the points that I wanted to cover. I think people, especially those outside of the information security industry or even OT itself, have struggled to understand how concerned they should be about this type of attack. Is this something that people should be worried about on a daily basis?
DP: From the standpoint of the general public, this sort of thing probably isn’t something they should spend a lot of time worrying about. But from a community standpoint, the people responsible for this, we obviously need to do a better job. You almost have to break it down into two categories. One is the truly large critical infrastructure that have been working on this problem and that need to do a better job. And then we have a lot of these small- to medium-sized ICS organizations that can’t afford to do the laundry list of things that would fall under good practice. And we need to help them understand these are the most important things you should do to reduce the chance of the attack.
TE: One of the results of this type of incident is that you’ve got the security community calling for better cyber hygiene for these types of environments. Is that the answer?
DP: That depends on what you mean by cyber hygiene. Speaking generally, I would say the answer is no. We’re not in a race to see who can put in the most security controls. The real thing we’re trying to do is manage risk to an appropriate level. So, if you spend a lot of precious time and effort doing things that don’t move the risk needle, then you’re not really accomplishing what you want. We need to be really clear in our messaging as to what they should do. We can’t just say “cyber hygiene.” We can’t say, “Patch everything. Configure everything.” They’re not going to be able to do it. And quite frankly, some of that doesn’t really accomplish much.
TE: That’s a key difference between IT and OT security. Over the last couple of decades, we’ve seen a real push on the IT side for systems to be secure by design. And on the OT side, that’s completely different. Is that right?
DP: Until we solve these problems, until the control systems are more securable, then trying to make them secure is a losing game. Obviously, anything that would allow you to get inside the perimeter needs to be as hardened and as secure as possible. But once you’re inside, you’re only limited by your engineering and automation skills. There are no hacking skills required once you’re inside the perimeter.
TE: Yeah. You mentioned the idea of control systems needing to be more securable. Can you talk a little bit about what that means or how they’re not securable today?
DP: Well, a lot of it is as simple as a lack of authentication. This actually happened in the attack on Ukraine. They bricked the serial-to-Ethernet gateways because there was a command to upload firmware that didn’t require authentication. So, they just uploaded bad firmware, and the thing stopped working.
TE: Yeah. And I think for the folks on the IT security side who aren’t familiar with OT, it’s really hard to conceptually understand how OT might work in those cases. This idea that you could upload firmware without any authentication or make those kinds of changes without authorization is in many ways a foreign concept.
DP: It’s something that has been a well-known fact in the OT world as well, but it just wasn’t really considered to be an issue for a long time. They said, “Yes, that’s just the way it was.” And you even had large organizations like Siemens and large protocols whose security advice was essentially to keep the bad guys out. But then you have maybe 1A and 1B: detect when they’re in and be able to recover if they got in.
The Risk Equation for OT Cyber Security
TE: This gets us back to the topic that you suggested we get to at some point in here: reducing the impact or consequences. How does that play into the OT cyber security side of things?
DP: This is really a big thing with Oldsmar. When you look at the risk equation, one version of it is likelihood times consequence. And security people immediately leap to more security controls to reduce the likelihood. But if you can reduce the consequence, you put a cap on your risk, because the likelihood can’t be higher than one. It’s also easier to explain to management: “Hey. Worst case, this is what would happen.” And it tends to be less hand-wavy; you can actually prove this is the worst that can happen.
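To make the arithmetic behind that point concrete, here is a minimal sketch of the risk relationship Dale is describing. The exact form of the equation varies by risk framework, so treat this as an illustration rather than his precise formula:

```latex
% A simplified version of the risk equation referenced above
% (one common form; frameworks differ in the details):
\[
  \text{risk} \;=\; \text{likelihood} \times \text{consequence},
  \qquad 0 \le \text{likelihood} \le 1 .
\]
% Because likelihood can never exceed 1, the worst-case consequence
% caps the risk:
\[
  \text{risk} \;\le\; \text{consequence}.
\]
% So a control that halves the worst-case consequence halves the cap on risk,
% regardless of how likely the attack is.
```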
TE: There’s an interesting corollary here for the IT security folks. The corollary is around ransomware. We spend a lot of effort trying to prevent cyberattacks of different types, but we’ve gotten to a point where we start thinking about consequences with ransomware and the idea that even if the attack is successful, we might limit lateral movement, and we might be in a position to recover from that attack. Yes, it’s going to be painful. We’ll have to take orders on paper or whatever. But that limits the consequence, which is a corollary I hadn’t thought of until now.
DP: That’s exactly right. Reducing recovery time is another way to reduce consequence. You really have to think about it. On the IT side, if you can’t accept the impact of ransomware affecting all of your computers, then you probably haven’t thought this through.
TE: Yeah. And it does go to that conversation about how you allocate your resources. So often with security, we think more resources are always better, but we lose track of that equation. And we lose track of the fact that the business has a mission that may not actually require perfect security.
DP: Yes. And I think you’re really onto something there, too.
One of the other things I see that’s heading in the wrong direction here is that you get a bunch of security people in the room. We like more and more controls. We say, “Security is everyone’s problem.” And we start to say, “We are now going to require our operators, our engineers, these people to do these extra 10 steps related to security.” And that’s probably just going down a road to failure. We should actually be trying to reduce the burden of security on these people, because anything we can automate so that it doesn’t require a person to do it for the thing to be secure is probably going to improve our situation.
TE: One last question for you, Dale. There’s this trend or conversation about IT-OT convergence, in security in particular. Who should actually own security in OT environments?
DP: Well, it really should be whoever the board assigns. The board, or executive management if there is no board, is responsible for risk, and they typically look to someone for cyber-related risk. That’s the person who has to drive the program. They are the one who makes the decisions in the end and reports up to the board.
TE: Yeah. Ultimately, it’s not a preference for who owns it but just that someone should. And if in your organization you’re not sure who that is, then maybe the question needs to be answered. All right. Well, this has been a very interesting conversation. Dale, I want to thank you for your time. And hopefully, it was interesting for our listeners, as well.
DP: You’re welcome, Tim.
TE: And thanks to everyone for listening. I hope you’ll tune in for the next episode of the Tripwire Cybersecurity Podcast.