Nvidia introduces ‘ridesharing for AI’ with DGX Cloud Lepton

The platform is currently in early access, but CoreWeave, Crusoe, Firmus, Foxconn, GMI Cloud, Lambda, Nscale, SoftBank, and Yotta have already agreed to make "tens of thousands of GPUs" available to customers.
Developers can access GPU compute capacity in specific regions, either on demand or under long-term contracts, supporting strategic and sovereign AI requirements. Nvidia expects leading cloud service providers and GPU marketplaces to also participate in the DGX Cloud Lepton marketplace.
The platform is built on the Nvidia AI software stack, including NIM and NeMo microservices, Nvidia Blueprints, and Nvidia Cloud Functions, to accelerate and simplify the development and deployment of AI applications.
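To make the NIM piece concrete: NIM microservices expose an OpenAI-compatible REST API, so a developer renting capacity through a marketplace like this would typically send standard chat-completion requests to a deployed endpoint. The sketch below builds such a request body; the endpoint URL and model name are placeholders, not actual DGX Cloud Lepton values.

```python
import json

# Placeholder endpoint and model; a real NIM deployment would expose
# its own host and serve a specific model. These are assumptions for
# illustration, not values taken from the DGX Cloud Lepton service.
NIM_URL = "http://localhost:8000/v1/chat/completions"
MODEL = "meta/llama-3.1-8b-instruct"


def build_request(prompt: str, max_tokens: int = 64) -> dict:
    """Assemble an OpenAI-style chat-completion request body,
    the format NIM microservices accept."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


body = build_request("Summarize DGX Cloud Lepton in one sentence.")
# A client would POST this JSON to NIM_URL, e.g. with requests.post(
# NIM_URL, json=body); here we just show the serialized payload.
print(json.dumps(body, indent=2))
```

Because the API is OpenAI-compatible, existing client libraries and tooling work against a NIM endpoint with only the base URL changed.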