Nvidia introduces ‘ridesharing for AI’ with DGX Cloud Lepton

The platform is currently in early access, but CoreWeave, Crusoe, Firmus, Foxconn, GMI Cloud, Lambda, Nscale, SoftBank, and Yotta have already agreed to make "tens of thousands of GPUs" available to customers.
Developers can tap GPU compute capacity in specific regions, either on demand or under longer-term commitments, supporting strategic and sovereign AI operational requirements. Nvidia expects leading cloud service providers and GPU marketplaces to join the DGX Cloud Lepton marketplace as well.
The platform uses the Nvidia AI software stack, including NIM and NeMo microservices, Nvidia Blueprints, and Nvidia Cloud Functions, to accelerate and simplify the development and deployment of AI applications.
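In practice, NIM microservices expose an OpenAI-compatible inference API, so a developer can point a standard client at model capacity rented through the marketplace. The sketch below is a minimal illustration of that pattern; the endpoint URL, API key variable, and model name are placeholders rather than details from Nvidia's announcement.

```python
# Minimal sketch: querying a NIM inference microservice through its
# OpenAI-compatible API. Endpoint, credential, and model ID are hypothetical;
# actual values come from whichever provider is serving the workload.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://nim.example.com/v1",      # hypothetical NIM endpoint
    api_key=os.environ.get("NIM_API_KEY", ""),  # credential issued by the provider
)

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",  # example model ID; the catalog varies by deployment
    messages=[{"role": "user", "content": "Summarize what DGX Cloud Lepton offers."}],
    max_tokens=128,
)
print(response.choices[0].message.content)
```

Because the interface mirrors the OpenAI API, existing application code can typically be repointed at a marketplace-hosted model by changing only the base URL and credentials.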