Microsoft’s Maia AI, Azure Cobalt chips to rev up efficiency, performance
Additionally, analysts believe that the new chips give Microsoft a window of opportunity to build its own AI accelerator software frameworks as demand for AI, and generative AI in particular, grows further.
“Building accelerators for AI workloads will be a way to improve performance while using less power than other chips such as graphics processing units (GPUs). Increasing performance while being energy efficient will continue to be more important for vendors and enterprises as they attempt to meet sustainability goals and benefit from the potential of AI,” Newman said.
Custom chips to give Nvidia, AMD and Intel a run for their money
Microsoft’s new custom chips are not powerful enough to replace Nvidia GPUs for developing large language models. But they are well suited to inferencing, that is, running already-trained models in operational AI workloads, and as they roll out they will reduce the company’s need for chips from Nvidia, AMD, and Intel, analysts said, adding that custom chips from AWS and Google will also challenge the chipmakers in the future.
“Intel, NVIDIA, and AMD are all seeing the rise of Arm-based instances and should see them as a competitive threat in certain instances,” Newman said.
Migrating workloads from x86 chips to Arm isn’t yet plug and play, since software is often written for a specific chip architecture, but it has become less of a sticking point as developers continue to make progress in running more and more workloads on Arm, the Futurum Group’s Newman said.
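To illustrate that point, the following is a minimal sketch (not from the article) of the kind of architecture-specific code that makes x86-to-Arm migration non-trivial: a vector addition written against x86 SSE intrinsics has to be rewritten against Arm NEON, or replaced with a portable fallback, before it will build on Arm-based instances.

```c
/* Illustrative sketch only: why code tied to a chip architecture
 * does not move from x86 to Arm without changes. */
#include <stdio.h>

#if defined(__x86_64__) || defined(_M_X64)
#include <emmintrin.h>          /* x86-only SSE intrinsics */

/* Adds two 4-float vectors using SSE registers. */
static void add4(const float *a, const float *b, float *out) {
    __m128 va = _mm_loadu_ps(a);
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(out, _mm_add_ps(va, vb));
}

#elif defined(__aarch64__)
#include <arm_neon.h>           /* Arm-only NEON intrinsics */

/* Same operation, rewritten for NEON on Arm. */
static void add4(const float *a, const float *b, float *out) {
    float32x4_t va = vld1q_f32(a);
    float32x4_t vb = vld1q_f32(b);
    vst1q_f32(out, vaddq_f32(va, vb));
}

#else
/* Portable scalar fallback for any other architecture. */
static void add4(const float *a, const float *b, float *out) {
    for (int i = 0; i < 4; i++) out[i] = a[i] + b[i];
}
#endif

int main(void) {
    float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40}, out[4];
    add4(a, b, out);
    printf("%.0f %.0f %.0f %.0f\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```

Compiled on an x86 machine the SSE path is used; the same source only builds on an Arm instance because the NEON path and scalar fallback exist, which is why the migration effort for a given workload depends on how much of its code is written this close to the hardware.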
Analysts say that with cloud service providers using custom silicon at varying levels, the data center market will see a “more meaningful shift” to Arm in the coming years despite x86 chips currently dominating market share by a substantial margin.
Among all chipmakers, Newman believes that Nvidia will be the least impacted, at least in the near term, as demand for its GPUs is set to remain elevated.
However, in some use cases the custom chips from cloud service providers may complement rather than compete with Nvidia's offerings, especially its Grace Hopper chips, which are targeted at developing and training large language models.
Microsoft’s new custom chips are expected to start rolling out to its data centers early next year. Since Microsoft does not plan to sell the chips to third parties, it will not have to contend with the restrictions the administration of US President Joe Biden has imposed on tech exports to China.