Linux Foundation works toward improved data-center efficiency


Organizations exploring the use of data-processing units (DPUs) and infrastructure processing units (IPUs) got a boost this week as the Linux Foundation announced a project to make them integral to future data-center and cloud-based infrastructures.

DPUs, IPUs, and smartNICs are programmable networking devices designed to free up CPUs for better performance in software-defined cloud, compute, networking, storage, and security services.

The new plan, called the Open Programmable Infrastructure (OPI) Project, calls for creating a community that develops standards for building DPU/IPU-based architectures. OPI will develop technologies designed to simplify network, storage, and security APIs within applications to enable more portable and efficient applications in the cloud and data center across DevOps, SecOps, and NetOps, the Linux Foundation stated.

Founding members of OPI include Dell Technologies, F5, Intel, Keysight Technologies, Marvell, NVIDIA, and Red Hat. OPI joins others such as AWS and AMD working to build smartNICs and DPUs for deployment in edge, colocation, or service-provider networks.

“DPUs and IPUs are great examples of some of the most promising technologies emerging today for cloud and data center, and OPI is poised to accelerate adoption and opportunity by supporting an ecosystem for DPU and IPU technologies,” said Mike Dolan, senior vice president of Projects at the Linux Foundation.

OPI goals include:

  • Delineating vendor-agnostic frameworks and architectures for DPU- and IPU-based software stacks applicable to any hardware solutions.
  • Enabling the creation of a rich open-source application ecosystem.
  • Integrating with existing open-source projects aligned to the same vision such as the Linux kernel.
  • Creating new APIs for interaction with, and between, the elements of the DPU and IPU ecosystem, including hardware, hosted applications, the host node, and the remote provisioning and orchestration of software (see the illustrative sketch after this list).
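
The announcement did not include API definitions, so any concrete illustration is speculative. The C sketch below is purely hypothetical: every name in it (opi_dpu_open, opi_offload_ipsec, the opi_dpu_handle struct) is invented for this article and is not part of any published OPI specification; the stubs simply print what a real driver would do, to show what "vendor-agnostic" means in practice.

    /* Hypothetical, vendor-agnostic DPU/IPU provisioning API -- all names
     * are invented for illustration; no OPI interface had been published
     * at the time of the announcement. */
    #include <stdio.h>
    #include <string.h>

    typedef struct {
        char vendor[32];    /* e.g. "bluefield-dpu" or "intel-ipu" */
        char pci_addr[16];  /* PCI address of the card in the host */
    } opi_dpu_handle;

    /* Stub: open a DPU/IPU by PCI address, whoever the vendor is. */
    static int opi_dpu_open(opi_dpu_handle *h, const char *pci_addr)
    {
        strncpy(h->pci_addr, pci_addr, sizeof(h->pci_addr) - 1);
        strncpy(h->vendor, "example-vendor", sizeof(h->vendor) - 1);
        return 0;  /* a real implementation would probe the device here */
    }

    /* Stub: ask the card to offload IPsec processing for a subnet. */
    static int opi_offload_ipsec(const opi_dpu_handle *h, const char *subnet)
    {
        printf("[%s @ %s] offloading IPsec for %s\n",
               h->vendor, h->pci_addr, subnet);
        return 0;
    }

    int main(void)
    {
        opi_dpu_handle dpu = {0};

        if (opi_dpu_open(&dpu, "0000:3b:00.0") != 0)
            return 1;

        /* The same two calls would work whether the card underneath is a
         * BlueField DPU or an Intel IPU -- that portability is the goal. */
        return opi_offload_ipsec(&dpu, "10.0.0.0/24");
    }

In practice such interfaces would more likely be exposed as remote-callable services than as a C header, but the portability argument is the same: applications target one API, and each vendor maps it onto its own hardware.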

According to Dolan, DPUs and IPUs are increasingly being used to support high-speed network capabilities and packet processing for applications like 5G, AI/ML, Web3, crypto, and more because of their flexibility in managing resources across networking, compute, security, and storage domains. Instead of servers being the infrastructure unit for cloud, edge, and the data center, operators could create pools of disaggregated networking, compute, and storage resources supported by DPUs, IPUs, GPUs, and CPUs to meet their customers’ application workloads and scaling requirements.

As part of the OPI announcement, NVIDIA contributed its DOCA networking software APIs to the project. DOCA includes drivers, libraries, services, documentation, sample applications, and management tools to speed up and simplify application development and improve performance, NVIDIA stated. DOCA provides flexibility and portability for BlueField applications written using accelerated drivers or low-level libraries such as DPDK, SPDK, Open vSwitch, or OpenSSL. BlueField is NVIDIA’s data-center services accelerator package.
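
To make the DPDK mention concrete, here is a minimal, generic DPDK receive-and-forward loop, condensed from the pattern of DPDK's public sample applications. It is not DOCA- or BlueField-specific code, and the port number, queue depths, and pool sizes are arbitrary illustrative values; the point is that an application written against these standard DPDK calls can sit on top of accelerated BlueField drivers or an ordinary NIC without changes.

    /* Generic DPDK burst receive/transmit loop (condensed from the style of
     * DPDK's public sample applications); nothing here is DOCA-specific. */
    #include <stdio.h>
    #include <rte_eal.h>
    #include <rte_ethdev.h>
    #include <rte_mbuf.h>

    #define NUM_MBUFS  8191
    #define CACHE_SIZE 250
    #define BURST_SIZE 32

    int main(int argc, char **argv)
    {
        struct rte_eth_conf port_conf = {0};  /* default port configuration */
        struct rte_mempool *pool;
        uint16_t port = 0;                    /* first available port */

        if (rte_eal_init(argc, argv) < 0) {
            fprintf(stderr, "EAL initialization failed\n");
            return 1;
        }
        if (rte_eth_dev_count_avail() == 0) {
            fprintf(stderr, "no DPDK-capable ports found\n");
            return 1;
        }

        /* Packet-buffer pool shared by the receive and transmit paths. */
        pool = rte_pktmbuf_pool_create("MBUF_POOL", NUM_MBUFS, CACHE_SIZE, 0,
                                       RTE_MBUF_DEFAULT_BUF_SIZE,
                                       rte_socket_id());

        /* One RX queue and one TX queue on the chosen port. */
        if (pool == NULL ||
            rte_eth_dev_configure(port, 1, 1, &port_conf) != 0 ||
            rte_eth_rx_queue_setup(port, 0, 1024, rte_socket_id(), NULL, pool) != 0 ||
            rte_eth_tx_queue_setup(port, 0, 1024, rte_socket_id(), NULL) != 0 ||
            rte_eth_dev_start(port) != 0) {
            fprintf(stderr, "port setup failed\n");
            return 1;
        }

        /* Poll for packets and echo them back out the same port. */
        for (;;) {
            struct rte_mbuf *bufs[BURST_SIZE];
            uint16_t nb_rx = rte_eth_rx_burst(port, 0, bufs, BURST_SIZE);
            uint16_t nb_tx = rte_eth_tx_burst(port, 0, bufs, nb_rx);

            /* Free anything the hardware could not transmit. */
            for (uint16_t i = nb_tx; i < nb_rx; i++)
                rte_pktmbuf_free(bufs[i]);
        }
    }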

