Original Link: https://www.anandtech.com/show/20006/memory-makers-on-track-to-double-hbm-output-in-2023

TrendForce projects a remarkable 105% increase in annual bit shipments of high-bandwidth memory (HBM) this year. The boost comes in response to soaring demand from developers of AI and high-performance computing processors, most notably Nvidia, and from cloud service providers (CSPs). To fulfill that demand, Micron, Samsung, and SK Hynix are reportedly expanding their HBM capacities, but the new production lines will likely start operations only in Q2 2024.

More HBM Is Needed

Memory makers managed to more or less match the supply and demand of HBM in 2022, a rare occurrence in the market of DRAM. However, an unprecedented demand spike for AI servers in 2023 forced developers of appropriate processors (most notably Nvidia) and CSPs to place additional orders for HBM2E and HBM3 memory. This made DRAM makers use all of their available capacity and start placing orders for additional tools to expand their HBM production lines to meet the demand for HBM2E, HBM3, and HBM3E memory in the future.

However, meeting this HBM demand is not straightforward. In addition to making more DRAM devices in their cleanrooms, DRAM manufacturers need to assemble those devices into intricate 8-Hi or 12-Hi stacks, and here they appear to have a bottleneck: according to TrendForce, they do not have enough TSV production tools. To produce enough HBM2, HBM2E, and HBM3 memory, leading DRAM producers have to procure new equipment, which takes 9 to 12 months to be built and installed in their fabs. As a result, a substantial hike in HBM output is anticipated around Q2 2024, the analysts claim.

A noteworthy trend pinpointed by TrendForce analysts is the shifting preference from HBM2E (used by AMD's Instinct MI210/MI250/MI250X, Intel's Sapphire Rapids HBM and Ponte Vecchio, and Nvidia's H100/H800 cards) to HBM3 (incorporated in Nvidia's H100 SXM and GH200 supercomputer platform as well as AMD's forthcoming Instinct MI300-series APUs and GPUs). TrendForce believes that HBM3 will account for 50% of all HBM memory shipped in 2023, with HBM2E accounting for 39%. In 2024, HBM3 is poised to account for 60% of all HBM shipments. This growing demand, combined with HBM3's higher price point, promises to boost HBM revenue in the near future.

Just yesterday, Nvidia launched a new version of its GH200 Grace Hopper platform for AI and HPC that uses HBM3E memory instead of HBM3. The new platform, which pairs a 72-core Grace CPU with a GH100 compute GPU, boasts higher memory bandwidth for the GPU and carries 144 GB of HBM3E memory, up from 96 GB of HBM3 in the original GH200. Considering the immense demand for Nvidia's AI offerings, Micron — which will be the only supplier of HBM3E in 1H 2024 — stands to benefit significantly from the freshly released hardware that HBM3E powers.

HBM Is Getting Cheaper, Kind Of

TrendForce also noted a consistent year-over-year decline in HBM ASPs. To sustain interest and offset decreasing demand for older HBM models, prices for HBM2E and HBM2 are set to drop in 2023, according to the market tracking firm. With 2024 pricing still undecided, further reductions for HBM2 and HBM2E are expected as HBM production increases and manufacturers pursue their growth ambitions.

In contrast, HBM3 prices are predicted to remain stable, perhaps because it is at present exclusively available from SK Hynix, and it will take some time for Samsung to catch up. Given its higher price compared to HBM2E and HBM2, HBM3 could push HBM revenue to an impressive $8.9 billion by 2024, a 127% year-over-year increase, according to TrendForce.
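As a quick sanity check of those forecast figures (illustrative arithmetic only, not TrendForce data): $8.9 billion in 2024 at 127% year-over-year growth implies 2023 HBM revenue of roughly $3.9 billion.

```python
# Back out the implied 2023 HBM revenue from TrendForce's 2024 forecast.
# $8.9B in 2024 after a 127% YoY increase means 2023 revenue was
# 8.9 / (1 + 1.27) -- a back-of-the-envelope check, not source data.
revenue_2024_bn = 8.9   # forecast 2024 HBM revenue, $B
yoy_growth = 1.27       # 127% YoY increase

implied_revenue_2023_bn = revenue_2024_bn / (1 + yoy_growth)
print(f"Implied 2023 HBM revenue: ${implied_revenue_2023_bn:.2f}B")
# → Implied 2023 HBM revenue: $3.92B
```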

SK Hynix Leading the Pack

SK Hynix commanded 50% of the HBM memory market in 2022, followed by Samsung with 40% and Micron with a 10% share. Between 2023 and 2024, Samsung and SK Hynix will continue to dominate the market, holding nearly identical stakes that together account for about 95%, TrendForce projects. Micron's market share, meanwhile, is expected to hover between 3% and 6%.

Meanwhile, SK Hynix seems, for now, to have an edge over its rivals: it is the primary producer of HBM3 and the only company to supply memory for Nvidia's H100 and GH200 products. In comparison, Samsung predominantly manufactures HBM2E, catering to other chip makers and CSPs, and is gearing up to start making HBM3. Micron, which does not have HBM3 on its roadmap, produces HBM2E (which Intel reportedly uses for its Sapphire Rapids HBM CPU) and is getting ready to ramp up production of HBM3E in 1H 2024, which will give it a significant competitive advantage over rivals that are expected to start making HBM3E only in 2H 2024.

Source: TrendForce
