Nvidia introduces ‘ridesharing for AI’ with DGX Cloud Lepton

The platform is currently in early access, but CoreWeave, Crusoe, Firmus, Foxconn, GMI Cloud, Lambda, Nscale, SoftBank, and Yotta have already agreed to make “tens of thousands of GPUs” available to customers.

Developers can access GPU compute capacity in specific regions for both on-demand and long-term workloads, supporting strategic and sovereign AI operational requirements. Nvidia expects leading cloud service providers and GPU marketplaces to also join the DGX Cloud Lepton marketplace.

The platform uses the Nvidia AI software stack, including NIM and NeMo microservices, Nvidia Blueprints, and Nvidia Cloud Functions, to accelerate and simplify the development and deployment of AI applications.
