Learn How to Design and Operate Edge Applications
Join Flo on August 18 for Conquering the Edge, the first of three webinars in a series leading up to our Develop with Cisco event in October.
When you read about “the Edge,” do you always know what is meant by that term? The edge is a very abstract concept. To a service provider, for example, the edge could mean computing devices close to a cell tower. To a manufacturing company, the edge could mean an IoT gateway with sensors located on the shop floor.
The edge needs further categorization, and fortunately there are some approaches. One of them comes from the Linux Foundation (LF Edge), which divides the edge into a User Edge and a Service Provider Edge. The Service Provider Edge provides server-based compute for the global fixed or mobile networking infrastructure. It is usually consumed as a service from a communications service provider (CSP).
The User Edge, on the other hand, is more complex. The computing devices are highly diverse, the environment can differ for each node, the hardware and software resources are limited, and the computing assets are usually owned and operated by the user. For example, in the On-Prem Data Center Edge sub-category, computing devices are very powerful and can be rack or blade servers located in local data centers. In the Constrained Device Edge sub-category, microcontroller-based computing devices (the kind you find in modern refrigerators or smart light bulbs) are used.
Overview of the Edge Continuum (based on the Linux Foundation whitepaper)
The Smart Device Edge
Currently, one of the most interesting edge tiers is the Smart Device Edge. In this tier, cost-effective, compact compute devices are used, especially for latency-critical applications. For me, this is the true Internet of Things edge computing tier. You find these devices distributed in the field, for example in remote and rugged environments, as well as embedded in vehicles.
Why is it hot right now? For me, there are three key factors.
- New Use-Cases are emerging because the technology is ready for them, or new technology was built to enable them. Either way, use-cases such as automated theft protection, predictive maintenance, autonomous driving, digital signage, and remote expert assistance can now be implemented using the latest technology developments.
- New Technology. Smart edge devices can nowadays be equipped with powerful computing hardware at a fair price and in a suitable form factor. Compare that with what your smartphone could do 10 years ago! The most important aspect is that graphics processing units (GPUs) are now small and powerful enough to be embedded in these devices, which enables applications to leverage AI/ML, computer vision, and AR/VR.
- Extending Cloud-native applications. The hardware of devices in the Smart Device Edge is not as powerful as a server in the data center; however, it is capable of containerization and virtualization, and therefore supports cloud-native software development, which will play a large role in the future. Workloads and new features can be extended from the cloud to these devices as edge native applications.
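As a minimal sketch of that idea: because container base images are published for multiple CPU architectures, the same image definition can run on an x86 gateway or an ARM-based edge device. The application file name below is hypothetical, purely for illustration:

```dockerfile
# Hypothetical edge workload. python:3.11-slim is published for both
# amd64 and arm64, so the same Dockerfile serves heterogeneous devices.
FROM python:3.11-slim
WORKDIR /app
COPY sensor_reader.py .            # hypothetical application file
CMD ["python", "sensor_reader.py"]
```

Built once with `docker buildx build --platform linux/amd64,linux/arm64`, a single image tag can then be pulled by the different device types deployed in the field.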
Edge Native Applications
Edge native applications are defined as “an application built natively to leverage edge computing capabilities, which would be impractical or undesirable to operate in a centralized data center” (see definition). These applications are usually distributed across multiple locations and therefore must be highly modular. They also need to be portable in order to run on different types of hardware in the field, and must be developed for devices with limited hardware resources. These applications increasingly leverage cloud-native principles such as containerization, microservice-based architecture, and Continuous Integration / Continuous Delivery (CI/CD) practices.
Another challenge is the deployment and management of these applications. Suitable application lifecycle management is important to provide horizontal scalability, ease deployment, and even accelerate the development of edge native applications.
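To make the constraints above concrete, here is an illustrative sketch (not from the webinar) of a common edge native pattern: a bounded store-and-forward buffer that keeps telemetry locally when the cloud is unreachable and flushes it once connectivity returns. The class and its API are hypothetical.

```python
from collections import deque

class EdgeBuffer:
    """Store-and-forward buffer for telemetry produced at the edge.

    Readings are queued locally; when connectivity returns, the backlog
    is uploaded in order. Hypothetical API, for illustration only.
    """

    def __init__(self, max_items=1000):
        # Bounded queue: edge devices have limited storage, so the
        # oldest reading is silently dropped once capacity is reached.
        self.queue = deque(maxlen=max_items)

    def record(self, reading):
        self.queue.append(reading)

    def flush(self, upload):
        """Send all buffered readings via the given upload callable."""
        sent = 0
        while self.queue:
            upload(self.queue.popleft())
            sent += 1
        return sent

# Usage: capacity 3, so the oldest of four readings is dropped.
buf = EdgeBuffer(max_items=3)
for r in [21.5, 21.7, 21.6, 22.0]:
    buf.record(r)

uploaded = []
count = buf.flush(uploaded.append)   # "upload" here just collects locally
print(count, uploaded)               # → 3 [21.7, 21.6, 22.0]
```

The bounded queue reflects the limited-resource constraint, and the injected `upload` callable is where a real application would put its cloud client.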
Join me for a free webinar
Edge Native Applications Are Conquering the Edge
Thursday, Aug 18th at 10:00 AM Pacific Time (UTC-07:00)
This webinar will focus on how cloud-native principles can be applied to applications at the edge from a development and operations point of view. You’ll get an understanding of how to design and operate edge applications, especially at the smart device edge with its heterogeneous hardware and software needs. Use-cases and demos will be provided along the way!
In “Conquering the Edge,” I will use a development and operations perspective to show how cloud-native principles can be applied to applications at the edge. Register now
We’d love to hear what you think. Ask a question or leave a comment below.
And stay connected with Cisco DevNet on social!
LinkedIn | Twitter @CiscoDevNet | Facebook | YouTube Channel