
NVIDIA Unveils New AI Chips

NVIDIA's new chip that can run 27 trillion parameter models!
Nvidia CEO Jensen Huang revealed the company's new generation of artificial intelligence chips, named Blackwell, set to ship later this year. The announcement, made at the company's GTC developer conference in San Jose, underscores Nvidia's push to stay ahead in AI hardware. The timing matters: demand for Nvidia's current H100 series remains high amid the ongoing AI boom sparked by ChatGPT.

GB200

The Blackwell series, starting with the GB200, marks a significant advance in AI computing: each B200 GPU delivers up to 20 petaflops of AI performance. That headroom allows for training larger, more complex AI models than before. Blackwell also signals a shift in Nvidia's role, from chip supplier to comprehensive platform provider in the mold of Microsoft or Apple.
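
To put 20 petaflops in context, a hedged back-of-envelope helps: a widely used rule of thumb estimates training compute at roughly 6 FLOPs per parameter per training token, and dividing that budget by sustained throughput gives wall-clock time. The model size, token count, and utilization factor in the sketch below are illustrative assumptions, not figures from the announcement.

```python
# Back-of-envelope: wall-clock training time at a given sustained throughput.
# The "6 * parameters * tokens" FLOPs estimate is a common rule of thumb;
# the utilization factor and token count below are illustrative assumptions.

def training_days(params: float, tokens: float, peak_flops: float, utilization: float = 0.3) -> float:
    total_flops = 6 * params * tokens        # rough total compute for one training run
    sustained = peak_flops * utilization     # peak numbers are rarely sustained in practice
    return total_flops / sustained / 86_400  # seconds -> days

# Example: a 70B-parameter model trained on 2 trillion tokens with one 20-petaflop chip.
print(f"~{training_days(70e9, 2e12, 20e15):,.0f} days on a single chip")
```

Even at these speeds, frontier-scale training runs clearly still need thousands of chips working in parallel, which is exactly the market Blackwell targets.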

Services

Highlighting this transition, Nvidia also unveiled NIM (Nvidia Inference Microservice), an addition to its software suite designed to streamline the deployment of AI models across a range of Nvidia GPUs, including older generations. The goal is to make it easier for developers to package, run, and serve their models on whatever Nvidia hardware they already have, with compatibility and efficiency handled by the platform.
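
Nvidia did not publish full details at the announcement, but the stated idea is a containerized inference endpoint that developers call like any hosted API. Below is a minimal sketch assuming a NIM-style container serving an OpenAI-compatible endpoint on localhost port 8000; the port, model name, and API-key handling are assumptions for illustration, not NIM documentation.

```python
# Minimal sketch: querying a locally running, OpenAI-compatible inference container.
# The base_url, model name, and api_key below are illustrative assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",   # assumed local endpoint for the container
    api_key="not-needed-locally",          # placeholder; a local service may not check keys
)

response = client.chat.completions.create(
    model="example-model",                 # hypothetical model identifier
    messages=[{"role": "user", "content": "Summarize the Blackwell announcement in one sentence."}],
)
print(response.choices[0].message.content)
```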

27 Trillion Parameters

As AI models grow rapidly in size and complexity, Nvidia's Blackwell processors, and the GB200 in particular, represent a crucial step forward. The GB200 pairs two B200 Blackwell GPUs with an Arm-based Grace CPU, and Nvidia positions it for training and serving enormous models, including those with as many as 27 trillion parameters.
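
To see why hardware at this scale matters, consider memory alone. The sketch below is a hedged back-of-envelope; the bytes-per-parameter and per-GPU memory figures are illustrative assumptions rather than published Blackwell specifications.

```python
# Back-of-envelope: GPUs needed just to hold model weights in memory,
# ignoring optimizer state, gradients, activations, and KV caches.
# Bytes-per-parameter and per-GPU memory are illustrative assumptions.

def gpus_for_weights(params: float, bytes_per_param: float = 2.0, gpu_memory_gb: float = 192.0) -> float:
    weight_bytes = params * bytes_per_param
    return weight_bytes / (gpu_memory_gb * 1e9)

# 27 trillion parameters at 2 bytes each (FP16/BF16) is about 54 TB of weights.
print(f"~{gpus_for_weights(27e12):.0f} GPUs just to hold the weights")
```

Training needs several times more memory than this (optimizer state, gradients, activations), which is why the multi-GPU server systems described next are central to the Blackwell story.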
Moreover, Nvidia is introducing server solutions such as the GB200 NVL72, which links multiple Blackwell GPUs into a single rack-scale system for large-scale AI training. Major cloud providers, including Amazon, Google, Microsoft, and Oracle, have shown interest in incorporating these new offerings into their platforms, signaling a strong market reception.
Overall, it seems like NVIDIA is continuing to lead the pack in terms of GPU offerings for AI, and it will be interesting to see how models improve as more advanced compute becomes available.
Tags: ML News