Nvidia and Equinix partner for AI data center infrastructure
Nvidia is partnering with data center giant Equinix to offer what the two vendors are calling Equinix Private AI with Nvidia DGX, a turnkey service for companies that want to get into generative AI but lack the data center infrastructure and expertise to do it on their own.
As part of the deal, Equinix hosts and manages Nvidia DGX supercomputers purchased by businesses. It is essentially a standard colocation agreement, except that instead of traditional x86 servers, Equinix is hosting Nvidia GPU clusters, which start at six figures and can run into the millions of dollars.
The service is aimed at businesses that don’t want their data in the public cloud for various reasons, including security, data sovereignty, and auditability, said Charlie Boyle, vice president of DGX systems at Nvidia, in a conference call with journalists.
“Lots of enterprises don’t have the expertise needed to build these very complex clusters of systems,” he said. “Most enterprise companies are sitting on a massive amount of enterprise data, sometimes decades of data. And all that data needs to be very close to the AI processing that they’re trying to accomplish.”
The ramp to AI is long and steep, the hardware is expensive, and talent is hard to come by; all of these factors inhibit companies from adopting and deploying AI comprehensively. Equinix Private AI with Nvidia DGX addresses this as a fully managed service, and Equinix is one of the largest data center providers in the world, with 250 facilities across 71 metropolitan areas.
“Many customers have the desire to have this capability within their company,” said John Lin, executive vice president and general manager of data center services at Equinix. “Customers can really reduce their time, from the moment that they have the idea that they want this AI infrastructure, to the time that they actually get it up and running.”