Nvidia on Monday announced a new generation of artificial intelligence chips and software for running AI models. The announcement, made during Nvidia's developer conference in San Jose, comes as the chipmaker seeks to solidify its position as the go-to supplier for AI companies, and as companies and software makers still scramble to get their hands on the current generation of H100s and similar chips.
“Hopper is fantastic, but we need bigger GPUs,” Nvidia CEO Jensen Huang said at the conference. “Let me introduce you to a very big GPU.” Nvidia enterprise computing chief Manuvir Das said the company's new software will make it easier to run programs on any of Nvidia's GPUs, even older ones that might be better suited for deploying, but not building, AI.
The chip includes what Nvidia calls a "transformer engine," specifically built to run transformer-based AI, one of the core technologies underpinning ChatGPT. The GB200 pairs two B200 Blackwell GPUs with one Arm-based Grace CPU. It will also be available as an entire server called the GB200 NVLink 2, combining 72 Blackwell GPUs and other Nvidia parts designed to train AI models. Nvidia will sell access to the GB200 through cloud services.
Nvidia will also sell B200 graphics processors as part of a complete system that takes up an entire server rack. Nvidia also announced it is adding a new product named NIM to its Nvidia enterprise software subscription.
Source: CNBC