
Global high-end AI server shipments

Jim Hsiao, DIGITIMES Research, Taipei


In response to explosive demand for generative AI and large language models (LLMs), major cloud service providers and leading server brands are stepping up their AI server efforts, focusing on procuring high-end AI servers built around accelerators with integrated high bandwidth memory (HBM).
Abstract

DIGITIMES Research projects that more than 80% of global high-end AI server shipments in 2023 will go directly from the factories to the global top-5 cloud service providers and server brands.

Among these firms, Microsoft, which has invested in OpenAI, the developer of ChatGPT, and devoted heavy resources to generative AI cloud services, will take the largest share of the volume, with all of its shipments procured directly from its ODM partners.

DIGITIMES Research believes tight CoWoS packaging capacity, which is required to integrate HBM with accelerators, will leave the supply of high-end AI servers more than 35% short of demand in 2023. Even so, worldwide high-end AI server shipments are expected to grow fivefold from a year earlier to around 172,000 units.
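A rough back-of-envelope calculation, using only the figures quoted above, shows what those numbers imply. This is not DIGITIMES' own methodology; in particular, reading the 35% shortfall as supply = demand x (1 - 0.35) is an assumption made here for illustration.

```python
# Back-of-envelope check of the shipment figures cited in the abstract.
shipments_2023 = 172_000        # projected 2023 high-end AI server shipments (units)
growth_multiple = 5             # "grow fivefold from a year earlier"
shortfall_vs_demand = 0.35      # supply short of demand by more than 35% (assumed interpretation)

implied_2022_shipments = shipments_2023 / growth_multiple
implied_2023_demand = shipments_2023 / (1 - shortfall_vs_demand)

print(f"Implied 2022 shipments: ~{implied_2022_shipments:,.0f} units")
print(f"Implied 2023 demand:    at least ~{implied_2023_demand:,.0f} units")
```

Under these assumptions, the projection implies roughly 34,400 units shipped in 2022 and 2023 demand of at least about 265,000 units.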

Price: NT$27,000 (approx. US$900)