Nvidia expands partnership with hyperscalers to boost AI training and development

Other collaborations between AWS and Nvidia include the use of Nvidia’s BioNeMo foundation models for generative chemistry, protein structure prediction, and understanding how drug molecules interact with targets, via AWS’ HealthOmics offering. The two companies’ healthcare teams are also working together to launch generative AI microservices to advance drug discovery, medtech, and digital health, they said.

Google Cloud to get Blackwell-powered DGX Cloud

Google Cloud Platform, like AWS, will be getting the new Blackwell GPU platform and integrating Nvidia’s NIM suite of microservices into Google Kubernetes Engine (GKE) to speed up AI inferencing and deployment. In addition, Nvidia DGX Cloud is now generally available on Google Cloud A3 VM instances powered by NVIDIA H100 Tensor Core GPUs, Google and Nvidia said in a joint statement.  

The two companies are also extending their partnership to bring Google’s JAX machine learning framework, which transforms numerical Python functions, to Nvidia’s GPUs. This means that enterprises will be able to use JAX for LLM training on Nvidia’s H100 GPUs via MaxText and the Accelerated Processing Kit (XPK), the companies said.
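To make the "transforming numerical functions" idea concrete, here is a minimal, hedged sketch of JAX's core transformations; the function and values are invented for illustration and have nothing to do with MaxText or XPK themselves:

```python
import jax
import jax.numpy as jnp

def loss(w):
    # A toy quadratic "loss" purely for illustration.
    return jnp.sum((w - 3.0) ** 2)

grad_loss = jax.grad(loss)      # transformation: automatic differentiation
fast_grad = jax.jit(grad_loss)  # transformation: XLA compilation (runs on GPU if one is present)

w = jnp.array([1.0, 5.0])
print(fast_grad(w))  # gradient of the quadratic: 2 * (w - 3) -> [-4.  4.]
```

The same user code runs unchanged on CPU, GPU, or TPU; the compilation backend, not the model code, decides where it executes, which is what makes JAX workloads portable to Nvidia hardware.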

In order to help enterprises with data science and analytics, Google said that its Vertex AI machine learning platform will now support Google Cloud A3 VMs powered by Nvidia’s H100 GPUs and G2 VMs powered by Nvidia’s L4 Tensor Core GPUs.

“This provides MLops teams with scalable infrastructure and tooling to manage and deploy AI applications. Dataflow has also expanded support for accelerated data processing on Nvidia GPUs,” the companies said.

Oracle and Microsoft too

Other hyperscalers, such as Microsoft and Oracle, have also partnered with Nvidia to integrate the chipmaker’s hardware and software to beef up their offerings.

Not only are both companies adopting the Blackwell GPU platform across their services, but both are also expected to offer Blackwell-powered DGX Cloud.


