Nvidia's latest coup: All of Taiwan on its software

Nvidia co-founder and chief executive Jensen Huang kicked off the annual Computex conference in Taipei, Taiwan on Monday with several announcements showcasing how Nvidia technology is being deployed across the island. Taken together, the announcements cast Taiwan as a giant engine running on Nvidia software.
Nvidia is “building accelerated computing and AI infrastructure in Taiwan,” the company said in a pre-briefing with media on Friday.
The announcements are in keeping with Huang’s focus on the rise of “sovereign AI,” in which nation-states build local computing resources for AI in order to gain greater control over artificial intelligence and its security.
Taiwan Semiconductor Manufacturing, the world’s largest contract chip maker and Nvidia’s manufacturing partner for most of its chips, is adopting Nvidia’s Grace CPUs and Blackwell GPUs, along with Nvidia software designed to simulate chip fabrication, called “cuLitho,” to run some parts of chip production, including computational lithography and chip inspection.
Taiwan Semi director for computer-aided design, Jeff Wu, said, “Our collaboration with Nvidia represents a significant advancement in semiconductor process simulation. The computational acceleration from CUDA-X libraries and Nvidia Grace Blackwell will expedite process development by simulating complex manufacturing processes and device behaviors at lower cost.”
The island’s National Center for High-Performance Computing is using multiple Nvidia technologies to build its next supercomputer, including Nvidia HGX computer systems running the Grace-Blackwell chip combination, connected by Nvidia’s Quantum InfiniBand networking.
In a related win for Nvidia, the Center, along with Taiwanese contract computer makers Compal and Quanta, and computer system maker Super Micro, is using Nvidia’s CUDA-Q open-source software to run experiments in pursuit of quantum computing.
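CUDA-Q is Nvidia’s open-source platform for hybrid quantum-classical programming. The announcement does not describe the partners’ experiments, but a minimal sketch of a CUDA-Q Python kernel, here a generic Bell-pair circuit used purely for illustration, looks something like this:

```python
import cudaq

# Illustrative example only; not taken from NCHC's or its partners' workloads.
@cudaq.kernel
def bell_pair():
    # Allocate two qubits and entangle them.
    qubits = cudaq.qvector(2)
    h(qubits[0])                    # put the first qubit in superposition
    x.ctrl(qubits[0], qubits[1])    # controlled-NOT entangles the pair
    mz(qubits)                      # measure both qubits

# Simulate the circuit; results should split between "00" and "11".
counts = cudaq.sample(bell_pair, shots_count=1000)
print(counts)
```

The same kernel can be dispatched to GPU-accelerated simulators or, eventually, quantum hardware backends, which is the role CUDA-Q plays in these research collaborations.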
Similarly, Taiwan Semi and Taiwanese manufacturers Delta Electronics, Foxconn, and Wistron said they are using Nvidia’s Omniverse simulation software to generate “digital twins” that will change how they plan their manufacturing facilities.
Foxconn, for example, has used the code “to design and simulate robot work cells, assembly lines, and entire factory layouts.”
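Omniverse digital twins are built on OpenUSD scene descriptions. The companies’ actual pipelines are not detailed in the announcement, but a minimal sketch using the open-source `pxr` USD Python bindings gives a sense of how a factory layout is described programmatically; the file name, prim paths, and coordinates below are assumptions for illustration only.

```python
from pxr import Usd, UsdGeom, Gf

# Create a new USD stage that could serve as the skeleton of a factory digital twin.
stage = Usd.Stage.CreateNew("factory_layout.usda")
UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.z)
UsdGeom.SetStageMetersPerUnit(stage, 1.0)

# Define a transformable prim for a robot work cell and place it on the floor plan.
cell = UsdGeom.Xform.Define(stage, "/Factory/AssemblyLine01/WorkCell01")
cell.AddTranslateOp().Set(Gf.Vec3d(12.0, 4.5, 0.0))

# Add a simple placeholder body; a real twin would reference detailed CAD assets.
body = UsdGeom.Cube.Define(stage, "/Factory/AssemblyLine01/WorkCell01/Body")
body.GetSizeAttr().Set(2.0)

stage.GetRootLayer().Save()
```

Scenes like this can then be opened in Omniverse for physics simulation, robot programming, and layout planning before any equipment is installed.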
Other announcements at the show emphasized Nvidia’s expanding influence and reach.
Nvidia will operate new software called DGX Cloud Lepton that connects developers with available pools of GPUs running at Nvidia partners’ cloud computing facilities. Lepton acts as a marketplace where artificial-intelligence developers can look up providers with spare GPU capacity, such as SoftBank, CoreWeave, and Nscale.
The Lepton software means that “the global GPU supply is intelligent and connected, delivering a virtual global AI factory at a planetary scale,” said Alexis Bjorlin, head of Nvidia’s DGX Cloud business, in the media briefing.
The Lepton service acts like a single dashboard, said Bjorlin, “giving developers the ability to deploy AI workloads securely wherever they choose.”
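Nvidia has not published details of how developers will interact with the service. Purely as a hypothetical sketch of the marketplace idea Bjorlin describes, a capacity query might look like the following; the endpoint, fields, and credential are invented for this example and do not correspond to any actual Lepton API.

```python
import requests

# Hypothetical marketplace endpoint and credential -- not Nvidia's real API.
API_BASE = "https://lepton.example.com/v1"
API_KEY = "YOUR_API_KEY"

# Ask the marketplace which providers currently have spare GPU capacity.
resp = requests.get(
    f"{API_BASE}/capacity",
    headers={"Authorization": f"Bearer {API_KEY}"},
    params={"gpu": "H100", "min_count": 8, "region": "any"},
    timeout=30,
)
resp.raise_for_status()

# List each provider's offer so the developer can choose where to deploy.
for offer in resp.json().get("offers", []):
    print(offer["provider"], offer["gpu_count"], offer["region"])
```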
Given constant shortages of Nvidia’s most advanced chips, the Lepton service is a canny way for Nvidia to rationalize the tight supply of its parts amid still-raging demand for AI compute.
When asked if Microsoft’s Azure cloud service will participate in Lepton, Bjorlin was somewhat noncommittal, responding, “We do expect all users to be able to bring their compute and the platforms that they currently use. So it’s certainly an option on this platform.”
For all the news announced by Nvidia at Computex, see the Nvidia Computex news wrap-up.