Operationalizing Zero Trust Architecture
By Chaim Mazal, Chief Security Officer, Gigamon
Over the last few years, organizations have been hearing about Zero Trust architecture (ZTA) as a means for IT leaders to take a more proactive approach to security. Yet the available ZTA documentation, guidance, and roadmaps are not one-size-fits-all; each organization’s mission, environment, staffing, and needs are unique. This is why organizations should think about ZTA guidance not only from a compliance perspective, but also from an operational one.
And this shift couldn’t come at a more critical time: 95 percent of organizations reported experiencing a ransomware attack in 2022.
So, how does an IT leader effectively implement and operationalize ZTA within their organization? Start by laying a solid foundation that will enable the three core building blocks: adaptability, data normalization, and visibility.
- Adaptability – IT environments change as business, mission, and environmental requirements evolve. Organizations need constant, consistent end-to-end visibility into the environment as computing shifts between on-premises physical and virtual compute resources and multiple cloud service providers. The dynamic nature of software-defined networking (SDN) also requires that the visibility fabric be easily adaptable.
- Data normalization – Data normalization is a core component of building robust, accurate, broad-based analytics across data sources for on-premises networks, containers, and multiple cloud providers. This step matters because artificial intelligence/machine learning-based (AI/ML) detection is only as good as the data used to train its classifiers. Standardizing and normalizing data sources (such as logs) across all components of the environment lets AI/ML-based detection engines drive policy-based decisions on user and system behaviors; wide variation in data and sources makes detection classifiers unreliable and inconsistent across an organization’s environment (see the normalization sketch after this list).
- Visibility – End-to-end visibility is another core component of ZTA that should be consistent and unified across the enterprise. I believe there are five critical areas where visibility is necessary:
- Cloud – Most organizations use, or will use, multiple cloud providers, and each may offer its own native, unique, and mutable log-generation tools. Standardizing network and application visibility across on-premises networks and the cloud allows unified monitoring.
- Containers – The rapid adoption and flexibility of containers create gaps in visibility for security teams and gaps in an organization’s ZTA. The ability to monitor and extract communication from containers will help prevent them from being a haven for cyber threat actors in your environment.
- Hybrid – Mixed on-premises and cloud compute environments make it challenging to gain single-pane visibility that is standardized across various and disparate environments. As organizations continue to migrate to hybrid and multi-cloud environments, leveraging the power of network-derived intelligence is more important than ever. In fact, research confirms that 75 percent of enterprises consider deep observability critical to mitigating threats quickly and effectively.
- Endpoints – Visibility at the endpoint level offers a wealth of data and information. Cross-referencing endpoint data against other sources helps identify advanced persistent threats, especially since data obtained from an endpoint can be tampered with if the device is compromised (a cross-checking sketch follows this list).
- IoT – Endpoints that can’t be covered by monitoring software, such as printers, IoT devices, appliances, and other operational technology (OT) devices, create blind spots unless a deep observability solution is in place.
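To make the data-normalization point concrete, here is a minimal sketch of mapping two differently shaped log sources onto one common event schema before they reach a detection pipeline. The field names, source formats, and `NormalizedEvent` schema are illustrative assumptions, not any specific product's API:

```python
# Hypothetical sketch: normalizing heterogeneous logs into one event schema
# before they feed an ML-based detection pipeline. All field names and
# source formats below are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class NormalizedEvent:
    timestamp: datetime  # always UTC
    source: str          # e.g. "aws", "firewall"
    src_ip: str
    dst_ip: str
    action: str          # normalized to "allow" / "deny"

def from_aws_flow_log(record: dict) -> NormalizedEvent:
    # AWS VPC flow logs use epoch seconds and ACCEPT/REJECT verdicts.
    return NormalizedEvent(
        timestamp=datetime.fromtimestamp(int(record["start"]), tz=timezone.utc),
        source="aws",
        src_ip=record["srcaddr"],
        dst_ip=record["dstaddr"],
        action="allow" if record["action"] == "ACCEPT" else "deny",
    )

def from_firewall_syslog(record: dict) -> NormalizedEvent:
    # Assume an on-prem firewall emitting ISO-8601 timestamps (with offset)
    # and PERMIT/DROP verdicts.
    return NormalizedEvent(
        timestamp=datetime.fromisoformat(record["ts"]).astimezone(timezone.utc),
        source="firewall",
        src_ip=record["src"],
        dst_ip=record["dst"],
        action="allow" if record["verdict"] == "PERMIT" else "deny",
    )

# Either source now yields the same shape, ready for one detection pipeline:
event = from_aws_flow_log(
    {"start": "1700000000", "srcaddr": "10.0.0.5",
     "dstaddr": "52.1.2.3", "action": "ACCEPT"}
)
print(event)
```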
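And to illustrate the endpoint cross-referencing idea: because network-derived data is harder for an attacker to tamper with than host logs, a flow seen on the wire but missing from the endpoint's own telemetry deserves attention. The function and field names below are hypothetical:

```python
# Hypothetical sketch: flag network flows the endpoint itself never reported.
# A flow observed on the network but absent from the host's telemetry may
# indicate a hidden process or rootkit. All field names are illustrative.
def find_unreported_flows(network_flows, endpoint_connections):
    """Return flows observed on the network but absent from endpoint logs."""
    # Key each endpoint-reported connection by (host_ip, remote_ip, port).
    reported = {
        (c["host_ip"], c["remote_ip"], c["port"]) for c in endpoint_connections
    }
    suspicious = []
    for flow in network_flows:
        key = (flow["src_ip"], flow["dst_ip"], flow["dst_port"])
        if key not in reported:
            suspicious.append(flow)  # worth a closer look
    return suspicious

# Example: the endpoint agent reports one connection, but the network
# fabric saw two; the second one warrants investigation.
flows = [
    {"src_ip": "10.0.0.5", "dst_ip": "52.1.2.3", "dst_port": 443},
    {"src_ip": "10.0.0.5", "dst_ip": "185.9.8.7", "dst_port": 4444},
]
agent = [{"host_ip": "10.0.0.5", "remote_ip": "52.1.2.3", "port": 443}]
print(find_unreported_flows(flows, agent))  # -> the 185.9.8.7 flow
```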
It’s not a matter of if an attack will occur, but when. Taking proactive steps to implement a ZTA by leveraging these three building blocks (adaptability, data normalization, and visibility) enables IT leaders to more effectively avoid common implementation challenges while finding a solution that works best within their existing infrastructure. A bonus? The organization is able to fend off cyberattacks before it’s too late.
About the Author
Chaim Mazal is the Chief Security Officer of Gigamon. He is responsible for global security, information technology, network operations, governance, risk, compliance, and internal business systems, as well as the security of Gigamon product offerings. Prior to joining Gigamon, he held similar roles with several industry leaders, most recently at Kandji, where he was the SVP of Technology and CISO. Chaim is a lifetime member of the Open Web Application Security Project (OWASP) Foundation and currently sits on several advisory boards, including those of Cloudflare, GitLab, and Lacework. Chaim holds a bachelor’s degree from the Rabbinical College of America.
Chaim can be reached on LinkedIn at linkedin.com/in/cmazal and at our company website https://www.gigamon.com/