Nvidia GTC 2024 wrap-up: Blackwell not the only big news

Dubbed Nvidia Inference Microservices (NIM), the software is part of the Nvidia AI Enterprise platform. It packages optimized inference engines, industry-standard APIs, and supported AI models into containers for easy deployment. NIM provides prebuilt models and also lets organizations bring their own proprietary data and models.

One thing you can say about NIM is that Nvidia did not develop it in a vacuum. The company worked with many major software vendors, including SAP, Adobe, Cadence, CrowdStrike, and ServiceNow, as well as data platform vendors, including Box, Cohesity, Cloudera, Databricks, DataStax, and NetApp.

NIM offers inference processing for many popular AI models from Google, Meta, Hugging Face, Microsoft, Mistral AI, and Stability AI. The NIM microservices will be available on Amazon Web Services, Google Kubernetes Engine, and Microsoft Azure AI.
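
Because NIM exposes industry-standard APIs, calling a deployed microservice should look much like calling any hosted LLM endpoint. The following is a minimal sketch in Python of what a request to a locally running NIM container might look like over an OpenAI-style chat completions API; the port, endpoint path, and model identifier are illustrative assumptions, not details confirmed in Nvidia's announcement.

```python
# Hypothetical sketch: querying a locally deployed NIM container through
# an OpenAI-compatible REST endpoint. The URL, port, and model name below
# are illustrative assumptions for a local deployment.
import requests

NIM_URL = "http://localhost:8000/v1/chat/completions"  # assumed local endpoint

payload = {
    "model": "meta/llama3-8b-instruct",  # hypothetical model identifier
    "messages": [
        {"role": "user", "content": "Summarize our Q3 support tickets."}
    ],
    "max_tokens": 256,
}

response = requests.post(NIM_URL, json=payload, timeout=60)
response.raise_for_status()

# OpenAI-style response schema: first choice carries the generated message.
print(response.json()["choices"][0]["message"]["content"])
```

In principle, the container packaging means the same request should work whether the microservice runs on-premises or on one of the cloud platforms listed above.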

Getting into storage validation

Storage is a key component of AI processing, because AI is nothing without copious amounts of data. To that end, Nvidia launched a storage partner validation program designed to help businesses find the right storage solutions by certifying them for AI and graphics-intensive workloads. The program carries the Nvidia OVX name, following a naming scheme similar to that of the company's DGX compute servers. The first companies seeking OVX storage validation are DDN, Dell PowerScale, NetApp, Pure Storage, and WEKA.

Nvidia OVX servers combine high-performance, GPU-accelerated compute with high-speed storage access and low-latency networking to address a range of complex AI and graphics-intensive workloads. The program provides a standardized process for partners to validate their storage appliances.

Server makers jump on Blackwell

All of the major OEMs announced new Blackwell-based offerings.
