Architecting Edge Computing Infrastructure Is Simplified With New Vertiv Report and Digital Configuration Tool
Edge computing is a simple concept: move processing and storage closer to devices and users to better manage the tsunami of data being generated and consumed across your enterprise and to enable new digital applications.
But it’s not so simple in execution. Edge use cases have different requirements that must be factored into your edge strategy. And edge computing sites are deployed in very different physical environments. While some edge use cases may be supported by IT racks in a regional colocation facility, others require putting IT in the back of a store, on the factory floor, or on a city street corner.
This complexity is compounded by the decentralized nature of edge computing. Organizations that embark on an edge computing strategy (our research indicates about half currently are) will typically need to deploy multiple edge sites to achieve their goals, increasing the importance of standardized and intelligent edge infrastructure.
Using archetypes and models for edge computing infrastructure configuration
As the network edge expands, the need to streamline the process of configuring and deploying edge computing sites increases. The first step in that process was to categorize edge use cases based on their latency, bandwidth, availability, and security requirements. Vertiv addressed that challenge by defining four edge computing archetypes.
Using the archetypes, your organization can quickly identify the key attributes of your edge network based on whether your use case is primarily Data Intensive, Human-Latency Sensitive, Machine-to-Machine Latency Sensitive, or Life Critical.
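To make that classification concrete, here is a minimal sketch, assuming a simple Python representation of the four archetypes. The latency, bandwidth, availability, and security figures are invented placeholders for illustration, not values from the Vertiv report.

```python
# Hypothetical sketch of the four archetypes as data -- all numbers are
# illustrative assumptions, not figures from the Vertiv archetypes.
from dataclasses import dataclass
from enum import Enum


class Archetype(Enum):
    DATA_INTENSIVE = "Data Intensive"
    HUMAN_LATENCY_SENSITIVE = "Human-Latency Sensitive"
    MACHINE_TO_MACHINE = "Machine-to-Machine Latency Sensitive"
    LIFE_CRITICAL = "Life Critical"


@dataclass
class ArchetypeProfile:
    max_latency_ms: float      # hypothetical latency target
    min_bandwidth_gbps: float  # hypothetical bandwidth floor
    availability_pct: float    # hypothetical uptime requirement
    security_tier: int         # hypothetical scale: 1 (basic) to 3 (hardened)


# Illustrative values only -- real requirements come from the use case itself.
PROFILES = {
    Archetype.DATA_INTENSIVE:          ArchetypeProfile(100.0, 10.0, 99.9,   1),
    Archetype.HUMAN_LATENCY_SENSITIVE: ArchetypeProfile(20.0,   1.0, 99.9,   2),
    Archetype.MACHINE_TO_MACHINE:      ArchetypeProfile(5.0,    1.0, 99.99,  2),
    Archetype.LIFE_CRITICAL:           ArchetypeProfile(5.0,    1.0, 99.999, 3),
}
```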
The next step, which we’ve recently completed, was to define edge infrastructure models that enable practical decisions around physical infrastructure.
Each model plays a role in supporting various edge use cases and has distinct infrastructure requirements. Regional Edge Data Centers, for example, are being employed to support Data Intensive and Human-Latency Sensitive use cases such as high-definition content distribution and cloud gaming.
For use cases that require lower latency than a Regional Edge Data Center can provide, Distributed Edge Data Centers and Micro Edge sites will be required. The Distributed Edge Data Center model offers higher availability than can typically be achieved at the Micro Edge but sacrifices some of the latency reduction provided by that model. Micro Edge sites can be deployed closest to data sources, delivering the ultra-low latency many use cases require.
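The latency-versus-availability trade-off among the three models can be pictured as a small decision rule. The sketch below is purely illustrative: the threshold numbers are assumptions, not figures from the Archetypes 2.0 report.

```python
# Hedged sketch of the model trade-off described above.
# Threshold values are invented for illustration only.

def choose_edge_model(max_latency_ms: float, availability_pct: float) -> str:
    """Pick an infrastructure model from hypothetical cut-offs."""
    if max_latency_ms >= 50.0:
        # Generous latency budget: a Regional Edge Data Center suffices.
        return "Regional Edge Data Center"
    if availability_pct >= 99.99:
        # Tight availability: Distributed Edge trades some latency for resilience.
        return "Distributed Edge Data Center"
    # Ultra-low latency, closest to the data source.
    return "Micro Edge"


print(choose_edge_model(max_latency_ms=100.0, availability_pct=99.9))  # Regional Edge Data Center
print(choose_edge_model(max_latency_ms=10.0, availability_pct=99.99))  # Distributed Edge Data Center
print(choose_edge_model(max_latency_ms=5.0, availability_pct=99.9))    # Micro Edge
```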
Simplifying edge computing infrastructure configuration
Through the lens of archetypes and models, we can now more efficiently home in on the key factors that must be addressed to effectively configure the physical infrastructure required to support edge computing. These include the following (a simplified sketch of the resulting decision flow appears after the list):
- Use case: The latency and availability requirements of the use case dictate what data center models need to be deployed.
- Location and environment: The data center model dictates the location and environment in which edge computing will operate.
- Number of racks: The closer the edge computing facility is to the point at which data is consumed or generated, the less compute and storage is generally required.
- Power requirements: Based on the number of racks and the physical environment of the site, power requirements can be determined, and the workload and physical characteristics of the infrastructure required to support a specific use case become clear.
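Taken together, these four factors form a simple pipeline from use case to power budget. The sketch below walks through it with invented rack counts and power densities; it is not the Vertiv configuration tool, just a rough illustration of the decision flow.

```python
# Hypothetical walk through the four factors above -- not the actual Vertiv
# configuration tool, only the shape of the decision flow. All values are
# assumptions for illustration.

RACKS_BY_MODEL = {  # illustrative rack counts per site
    "Regional Edge Data Center": 20,
    "Distributed Edge Data Center": 4,
    "Micro Edge": 1,
}

KW_PER_RACK = {  # illustrative power densities (kW per rack)
    "Regional Edge Data Center": 8.0,
    "Distributed Edge Data Center": 6.0,
    "Micro Edge": 3.0,
}


def estimate_site(model: str, environment: str) -> dict:
    """Derive a rack count and a rough power budget for one edge site."""
    racks = RACKS_BY_MODEL[model]
    power_kw = racks * KW_PER_RACK[model]
    if environment == "uncontrolled":  # e.g., factory floor or street cabinet
        power_kw *= 1.2                # hypothetical overhead for ruggedized cooling
    return {"model": model, "racks": racks, "estimated_power_kw": round(power_kw, 1)}


print(estimate_site("Micro Edge", environment="uncontrolled"))
```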
If you’re thinking it would be nice to put all of that information in a digital tool that can determine the specific requirements of your use case, we’re one step ahead of you. In conjunction with our new report, Archetypes 2.0: Deployment-Ready Edge Infrastructure Models, we’ve launched a configuration tool designed to simplify the process of configuring edge infrastructure based on common use cases. Try it out today.
Copyright © 2021 IDG Communications, Inc.