Nvidia on Monday announced a new generation of artificial intelligence chips and software for running AI models. The announcement, made during Nvidia's developer conference in San Jose, comes as the chipmaker seeks to solidify its position as the go-to supplier for AI companies, and as companies and software makers still scramble to get their hands on the current generation of H100s and similar chips.
At the center of the announcement is Nvidia's GB200 Grace Blackwell Superchip, which pairs two B200 Blackwell graphics processors with one Arm-based Grace central processor. The chip includes what Nvidia calls a transformer engine, specifically built to run transformer-based AI, one of the core technologies underpinning ChatGPT. Nvidia will also sell B200 graphics processors as part of a complete system that takes up an entire server rack: the GB200 NVLink 2, which combines 72 Blackwell GPUs and other Nvidia parts designed to train AI models. Nvidia will sell access to the GB200 through cloud services as well.

The company also introduced revenue-generating software called NIM, a new product added to its Nvidia enterprise software subscription, that will make it easier to deploy AI models, giving customers another reason to stick with Nvidia chips. "If you're a developer, you've got an interesting model you want people to adopt, if you put it in a NIM, we'll make sure that it's runnable on all our GPUs, so you reach a lot of people," Das said.
Source: CNBC