Last updated on Nov 14, 2023 @ 10:26 am by TBB Desk
Nvidia is gearing up to release its premier AI processor, the H200, in the second quarter of 2024. The new chip is expected to deliver marked performance gains, including a nearly two-fold increase in inference speed on Meta’s Llama 2 language model compared with its predecessor, the H100. Major cloud providers are already lined up to offer the new technology next year.
Nvidia’s stock rose a noticeable 1.5% following the announcement. The H200 distinguishes itself with the introduction of HBM3e memory technology, offering a robust 141GB of memory capacity and a transfer rate of 4.8 terabytes per second. Amid Nvidia’s dominance in the AI chip market, competitors like Advanced Micro Devices and Intel are stepping up, with AMD’s upcoming MI300 and Intel’s Gaudi 2 chips challenging Nvidia’s supremacy. Furthermore, companies such as OpenAI, known for ChatGPT, are exploring the creation of their own AI chips, signaling a competitive horizon in the AI processor market.