Nvidia and Equinix partner for AI data center infrastructure
Nvidia is partnering with data center giant Equinix to offer what the vendors are calling Equinix Private AI with Nvidia DGX, a turnkey solution for companies that are looking to get into the generative AI game but lack the data center infrastructure and expertise to do it.
As part of the deal, Equinix hosts and manages Nvidia DGX supercomputers purchased by businesses. It is essentially a standard colocation agreement, except that instead of traditional x86 servers, Equinix is hosting Nvidia GPU clusters, which start at six figures and run into the millions.
It’s made for businesses that don’t want their data in the public cloud for various reasons, including security, data sovereignty and auditability, said Charlie Boyle, vice president of DGX systems at Nvidia, in a conference call with journalists.
“Lots of enterprises don’t have the expertise needed to build these very complex clusters of systems,” he said. “Most enterprise companies are sitting on a massive amount of enterprise data, sometimes decades of data. And all that data needs to be very close to the AI processing that they’re trying to accomplish.”
The ramp to AI is long and steep, the technology is expensive, and talent is hard to come by, all of which inhibit companies from adopting and deploying AI at scale. Equinix Private AI with Nvidia DGX is a fully managed service, and Equinix is one of the largest data center providers in the world, with 250 facilities in 71 metropolitan areas.
“Many customers have the desire to have this capability within their company,” said John Lin, executive vice president and general manager of data center services at Equinix. “Customers can really reduce their time, from the moment that they have the idea that they want this AI infrastructure, to the time that they actually get it up and running.”