4 Comments
Scabies - Tuesday, August 8, 2023 - link
> NVIDIA's GH200 GPU is set to be the world's first chip, complete with HBM3e memory

Looks like AI isn't quite there yet.
mode_13h - Tuesday, August 8, 2023 - link
Initially, I saw the "G" in front of GH200, and I thought "neat, a version with graphics units!". But, maybe it's just there because "Grace Hopper"? Lame.

Regarding the 3x memory bandwidth claim, I'm guessing that's in relation to a dual-populated "superchip" module, where each chip gets 5 TB/s, yielding 10 TB/s. That figure is then compared to the H100 SXM @ 3 TB/s. So, it's more like a 67% improvement per GPU.

It annoyed me to hear that claim, because I was sure HBM3e wasn't 3x as fast as even HBM2e, which is what the slowest H100 used.
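The arithmetic behind this comment can be sketched as follows; note the 5 TB/s per-GPU figure is the commenter's guess, not a confirmed spec:

```python
# Bandwidth comparison implied by the comment above.
# Assumption (per the commenter): each GH200 GPU gets ~5 TB/s of HBM3e bandwidth.
h100_sxm_bw = 3.0                        # TB/s, H100 SXM
gh200_per_gpu_bw = 5.0                   # TB/s, assumed per GPU
dual_module_bw = 2 * gh200_per_gpu_bw    # 10 TB/s for a dual-GPU "superchip"

module_speedup = dual_module_bw / h100_sxm_bw       # ~3.33x, matching the "3x" claim
per_gpu_gain = gh200_per_gpu_bw / h100_sxm_bw - 1   # ~0.67, i.e. ~67% per GPU
print(f"module: {module_speedup:.2f}x, per GPU: +{per_gpu_gain:.0%}")
```

So the "3x" figure only holds if the whole dual-GPU module is compared against a single H100.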
lemurbutton - Wednesday, August 9, 2023 - link
Who cares? This is not marketed towards you.

Qasar - Sunday, August 13, 2023 - link
or you, as you will claim M1/M2/M3/M4 etc. will beat it.

Besides, you wouldn't want to upset your bosses when you buy something that isn't made by Apple, would you?