Spending on switches deployed in AI back-end networks for accelerated servers is forecast to surpass $100 B over the next five years (2025-2029), according to a new report from Dell’Oro Group. The market was led by Celestica, Huawei, and NVIDIA in 2024, but major market shifts may unfold in 2025.
“Despite rising concerns about the future of spending on accelerated infrastructure amid DeepSeek’s recent announcement that it trained its open-source model with significantly fewer resources than its U.S. counterparts, we continue to project strong demand for high-end accelerators along with the infrastructure needed to support them,” said Sameh Boujelbene, Vice President at Dell’Oro Group. “Our updated forecast of data center switch sales in AI back-end networks was once again raised compared to our July 2024 forecast. This upward revision, however, favors Ethernet, while our forecast for the InfiniBand portion of the market was reduced.
“Ethernet is gaining momentum, driven by both supply and demand tailwinds, as more large-scale AI clusters adopt it as the primary fabric. While the growing diversity of accelerators is boosting adoption, the more notable trend is that even large NVIDIA GPU-based clusters, like xAI’s Colossus, are being deployed with Ethernet. As a result, we have moved up our forecast for Ethernet’s crossover with InfiniBand by one year. In 2024, Celestica, Huawei, and NVIDIA led the Ethernet segment of the market, but we expect Accton, Arista, Cisco, Juniper, Nokia, and other vendors to gain more traction in 2025, reshaping market dynamics,” continued Boujelbene.
Additional highlights from the AI Networks for AI Workloads January 2025 Report:
- The majority of switch ports deployed in AI back-end networks are expected to be 800 Gbps by 2025, 1600 Gbps by 2027, and 3200 Gbps by 2030.
- While Tier 1 Cloud Service Providers will drive the majority of demand, the forecast for Tier 2/3 providers and large enterprises has been significantly increased due to their rapid adoption of AI workloads.