Arista financials offer glimpse of AI network development
AI cluster networking speeds are expected to grow from 200/400/800 Gbps today to over 1 Tbps in the near future, according to Sameh Boujelbene, vice president for Ethernet switch market research at Dell’Oro Group.
Dell’Oro forecasts that by 2025 the majority of ports in AI networks will be 800 Gbps, and that by 2027 the majority will be 1600 Gbps, reflecting very rapid adoption of the highest speeds available in the market. “This pace of migration is almost twice as fast as what we usually see in the traditional front-end network that is used to connect general-purpose servers,” Boujelbene stated in a recent report.
Arista believes it has a strong, three-pronged approach to scaling networking speeds as needed and capitalizing on the current growth in AI communications capabilities. Three key products – the Arista 7700R4 Distributed Etherlink Switch, the 7800R4 Spine switch, and the 7060X6 Leaf – are all in production and support 800G as well as 400G optical links.
Facebook’s parent company, Meta Platforms, helped develop the 7700 and recently said it would be deploying the Etherlink switch in its Disaggregated Scheduled Fabric (DSF), which features a multi-tier network that supports around 100,000 DPUs, according to reports. The 7700R4 AI Distributed Etherlink Switch (DES) supports the largest AI clusters, offering massively parallel distributed scheduling and congestion-free traffic spraying based on the Jericho3-AI architecture.
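To illustrate the traffic-spraying idea behind the DES architecture, the sketch below (illustrative Python only, not Arista code; the link and flow counts are made-up values) contrasts per-flow ECMP hashing, where a handful of large AI training flows can collide on one uplink, with per-packet spraying, where each flow's packets are spread evenly across all uplinks.

```python
import random
from collections import Counter

LINKS = 8             # uplinks between leaf and spine (illustrative value)
FLOWS = 16            # long-lived "elephant" flows, typical of AI training
PACKETS_PER_FLOW = 1000

def per_flow_ecmp(seed: int = 1) -> Counter:
    """Classic ECMP: every packet of a flow hashes to the same uplink,
    so unlucky hash collisions can overload one link while others idle."""
    rng = random.Random(seed)
    load = Counter({link: 0 for link in range(LINKS)})
    for _ in range(FLOWS):
        load[rng.randrange(LINKS)] += PACKETS_PER_FLOW   # hash(flow) % LINKS, simplified
    return load

def per_packet_spray() -> Counter:
    """Packet spraying: packets of the same flow are distributed across
    all uplinks (round-robin here), so per-link load stays nearly equal."""
    load = Counter({link: 0 for link in range(LINKS)})
    for flow in range(FLOWS):
        for pkt in range(PACKETS_PER_FLOW):
            load[(flow + pkt) % LINKS] += 1
    return load

if __name__ == "__main__":
    ecmp, spray = per_flow_ecmp(), per_packet_spray()
    print("per-flow ECMP  min/max link load:", min(ecmp.values()), max(ecmp.values()))
    print("packet spray   min/max link load:", min(spray.values()), max(spray.values()))
```

The real switch performs this scheduling in hardware across the fabric; the point of the model is only that spraying keeps every uplink close to evenly loaded, which is why it avoids congestion on the large, synchronized flows AI clusters generate.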
The 7060X6 AI Leaf switch features Broadcom Tomahawk 5 silicon with a capacity of 51.2 Tbps and support for 64 800G or 128 400G Ethernet ports, and the 7800R4 AI Spine utilizes Broadcom Jericho3-AI processors with an AI-optimized packet pipeline and supports up to 460 Tbps in a single chassis, which corresponds to 576 800G or 1152 400G Ethernet ports.
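Those chassis figures follow directly from the port counts. A quick back-of-the-envelope check (line rate only, ignoring encoding and protocol overhead) confirms the arithmetic:

```python
# Rough sanity check of the quoted switch capacities.
def aggregate_tbps(ports: int, gbps_per_port: int) -> float:
    return ports * gbps_per_port / 1000

print(aggregate_tbps(64, 800))    # 7060X6 leaf:   64 x 800G  = 51.2 Tbps
print(aggregate_tbps(128, 400))   # 7060X6 leaf:  128 x 400G  = 51.2 Tbps
print(aggregate_tbps(576, 800))   # 7800R4 spine:  576 x 800G = 460.8 Tbps
print(aggregate_tbps(1152, 400))  # 7800R4 spine: 1152 x 400G = 460.8 Tbps
```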
“This broad range of Ethernet platforms allows our customers to optimize density and minimize tiers to best match the requirements of their AI work,” said John McCool, Arista senior vice president and chief platform officer, during the financial call.