9 Comments
Sivar - Wednesday, August 9, 2023 - link
"but new production lines will likely start operations only in Q2 2022" Date typo?
PeachNCream - Wednesday, August 9, 2023 - link
It's Anton, so yes, there are typos ;)

Rudde - Thursday, August 10, 2023 - link
Reading further, the date is specified as Q2 2024.

nandnandnand - Thursday, August 10, 2023 - link
Pray for overproduction, AI bubble bursts, prices drop, and consumer products can include HBM. Ha ha.

LiKenun - Thursday, August 10, 2023 - link
There was an interesting take from an academic who said AI could train on compressed data, which would be not only necessary because of finite resources, but also much more efficient, with minimal loss in the end result.

The bubble could burst for memory, storage, and high-bandwidth networking if this proves true and the industry/tech stack moves in that direction.
nandnandnand - Thursday, August 10, 2023 - link
If by compressed data you mean the research being done to prune datasets and optimize model sizes: ideally, model sizes can drop dramatically, e.g. by 90%, with imperceptible quality loss, and possibly even improve if junk data is taken out.

brucethemoose - Friday, August 11, 2023 - link
The training RAM bottleneck isn't really the data itself, but the model weights and the training process.

There are lots of experiments with quantization, pruning, backward-pass-free training, data augmentation/filtering and such, but at the end of the day you *have* to pass through billions of parameters for each training/inference pass.
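The point that weights, not data, dominate memory can be made concrete with some rough arithmetic. A minimal sketch; the 7-billion-parameter count and bit widths below are hypothetical placeholders for illustration, not figures from the comments above:

```python
# Rough memory math for storing model weights at different bit widths.
# The 7B parameter count is a hypothetical example, not a specific model.

def weight_memory_gb(n_params: int, bits_per_weight: int) -> float:
    """GB needed just to hold n_params weights at the given bit width."""
    return n_params * bits_per_weight / 8 / 1e9

n = 7_000_000_000                   # hypothetical 7B-parameter model
fp16 = weight_memory_gb(n, 16)      # 14.0 GB at 16-bit precision
int4 = weight_memory_gb(n, 4)       # 3.5 GB after 4-bit quantization
print(f"fp16: {fp16:.1f} GB, int4: {int4:.1f} GB")
```

Quantization shrinks each weight's footprint, but as the comment notes, every parameter still has to be read on each training or inference pass, so memory bandwidth (and hence HBM) stays relevant even at lower precision.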
Diogene7 - Friday, August 11, 2023 - link
Would it be possible to get an idea/range of the average selling price (ASP) for 16GB and 24GB HBM2E / HBM3, and to compare it to SK Hynix 24GB LPDDR5X, for example?

It would give an idea / order of magnitude of how much more expensive HBM memory is versus LPDDR5 memory at the same capacity…
phoenix_rizzen - Tuesday, August 29, 2023 - link
You've got your labels backwards in this paragraph, compared to the graphic displayed right after it:

"TrendForce believes that HBM3 will account for 50% of all HBM memory shipped in 2023, whereas HBM2E will account for 39%. In 2024, HBM3 is poised to account for 60% of all HBM shipments. This growing demand, when combined with its higher price point, promises to boost HBM revenue in the near future."

Should be HBM3 at 39% (black) and HBM2E at 50% (dark green) in 2023.