HPE, Dell launch another round of AI servers

First up is a series of new PowerEdge servers. The PowerEdge XE9712 provides high-performance, dense acceleration for LLM training and real-time inferencing in large-scale AI deployments. It is built around Nvidia’s GB200 NVL72, which pairs up to 36 Nvidia Grace CPUs with 72 Nvidia Blackwell GPUs in a rack-scale design. The 72 GPUs sit in a single NVLink domain that lets them operate as one large GPU, delivering up to 30x faster real-time inferencing of trillion-parameter LLMs.
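Neither Dell nor Nvidia publishes reference code for this, but a minimal sketch helps illustrate what the rack-scale design looks like from the software side: the NVLink-connected GPUs still enumerate as individual CUDA devices, and a framework such as PyTorch spreads work across them over the NCCL/NVLink fabric. The torchrun launcher and one-process-per-GPU layout below are assumptions, not part of Dell’s announcement.

```python
# A minimal sketch (assumptions, not Dell/Nvidia reference code): one process per GPU,
# launched with torchrun, using the NCCL backend so collectives ride the NVLink fabric.
import torch
import torch.distributed as dist


def init_worker() -> None:
    dist.init_process_group(backend="nccl")   # torchrun supplies RANK/WORLD_SIZE env vars
    local_rank = dist.get_rank() % torch.cuda.device_count()
    torch.cuda.set_device(local_rank)          # pin this process to its GPU
    print(f"rank {dist.get_rank()}/{dist.get_world_size()} "
          f"on cuda:{local_rank} ({torch.cuda.get_device_name(local_rank)})")


if __name__ == "__main__":
    init_worker()
    dist.destroy_process_group()
```

Launched with something like `torchrun --nproc_per_node=<GPUs per node> script.py`, each process claims one GPU; model-parallel frameworks then build tensor and pipeline parallelism on top of the same process group.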

The Dell PowerEdge M7725 is designed for high-performance, dense compute, making it well suited to research, government, fintech and higher-education environments, according to Dell. It scales from roughly 24,000 to 27,000 cores per rack across 64 or 72 two-socket nodes built on 5th-generation AMD Epyc processors, and it supports both direct liquid cooling and air cooling.
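Dell’s announcement doesn’t name the exact Epyc SKU, but as a rough back-of-the-envelope check, the quoted range lines up with top-bin 192-core fifth-generation parts (an assumption, not a Dell spec):

```python
# Back-of-the-envelope check (assumes 192-core 5th-gen Epyc parts; Dell does not name the SKU)
cores_per_cpu = 192
sockets_per_node = 2
for nodes in (64, 72):
    total = nodes * sockets_per_node * cores_per_cpu
    print(f"{nodes} nodes -> {total:,} cores per rack")
# 64 nodes -> 24,576 cores (~24K); 72 nodes -> 27,648 cores (~27K)
```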

In addition to compute, Dell is offering unstructured data storage and data management through its PowerScale storage devices, aimed at improving AI application performance and simplifying global data management.

The new PowerScale systems feature faster metadata performance and data discovery through the Dell Data Lakehouse, while new 61TB drives boost capacity and efficiency and cut the data center storage footprint in half. PowerScale also adds InfiniBand support and 200GbE Ethernet adapters, delivering up to 63% faster throughput.

To house all of this hardware, Dell is introducing the Integrated Rack 7000 (IR7000), which is built to handle accelerated-computing demands with greater density, more sustainable power management and advanced cooling technologies. It’s based on Open Compute Project (OCP) standards.

The IR7000 rack is built natively for liquid cooling and can cool future deployments of up to 480 kW. It captures nearly 100% of the heat generated, according to Dell. The rack supports both Dell and off-the-shelf networking and ships as an integrated, plug-and-play rack-scale system.


