Nvidia and Equinix partner for AI data center infrastructure
Nvidia is partnering with data center giant Equinix to offer what the vendors are calling Equinix Private AI with Nvidia DGX, a turnkey solution for companies that want to get into the generative AI game but lack the data center infrastructure and expertise to do so.
As part of the deal, Equinix hosts and manages Nvidia DGX supercomputers purchased by businesses. It is essentially a standard colocation agreement, except that instead of traditional x86 servers, Equinix hosts Nvidia GPU clusters, which start at six figures and run into the millions.
It’s made for businesses that don’t want their data in the public cloud for various reasons, including security, data sovereignty and auditability, said Charlie Boyle, vice president of DGX systems at Nvidia, in a conference call with journalists.
“Lots of enterprises don’t have the expertise needed to build these very complex clusters of systems,” he said. “Most enterprise companies are sitting on a massive amount of enterprise data, sometimes decades of data. And all that data needs to be very close to the AI processing that they’re trying to accomplish.”
The ramp to AI is long and steep, the technology is expensive, and talent is hard to come by; all of these factors hold companies back from adopting and deploying AI comprehensively. Equinix Private AI with Nvidia DGX is a fully managed service, and Equinix is one of the largest data center providers in the world, with 250 facilities in 71 metropolitan areas.
“Many customers have the desire to have this capability within their company,” said John Lin, executive vice president and general manager of data center services at Equinix. “Customers can really reduce their time, from the moment that they have the idea that they want this AI infrastructure, to the time that they actually get it up and running.”