Nvidia CEO Reveals Latest AI Chip Priced at Over $30,000

Nvidia’s upcoming iteration of its artificial intelligence-centric graphics processor, dubbed Blackwell, will come with a price tag ranging between $30,000 and $40,000 per unit, as revealed by CEO Jensen Huang in a conversation with CNBC’s Jim Cramer.

“This will cost $30 to 40 thousand dollars,” Huang affirmed, showcasing the Blackwell chip.

“We had to invent some new technology to make it possible,” he elaborated, estimating Nvidia’s expenditure on research and development at around $10 billion.

The pricing is comparable to that of its predecessor, the H100, or the “Hopper” generation, which analysts estimate was priced between $25,000 and $40,000 per chip.

The Hopper generation, introduced in 2022, brought a significant price hike for Nvidia’s AI chips. (Credits: Nvidia)

The introduction of the Hopper generation in 2022 marked a notable price escalation for Nvidia’s AI chips compared to the previous generation.

Nvidia typically reveals a fresh generation of AI chips approximately every two years. These latest iterations, like Blackwell, generally boast improved speed and energy efficiency. Nvidia leverages the buzz around a new generation to amass orders for new GPUs.

Blackwell, for instance, integrates two chips and is physically larger than its predecessor.

The proliferation of Nvidia’s AI chips has propelled a threefold increase in the company’s quarterly sales since the advent of the AI boom in late 2022, which coincided with the introduction of OpenAI’s ChatGPT.

Blackwell AI accelerator: three versions announced, with varied memory configurations, slated for release later this year. (Credits: Nvidia)

Many leading AI companies and developers have relied on Nvidia’s H100 to train their AI models over the past year. Meta, for instance, announced its procurement of hundreds of thousands of Nvidia H100 GPUs earlier this year.

Nvidia refrains from disclosing the list price for its chips, which are available in various configurations.

The price paid by end consumers such as Meta or Microsoft hinges on factors like the chip volume purchased and whether the customer procures the chips directly from Nvidia as part of a complete system or through vendors like Dell, HP, or Supermicro, which assemble AI servers. Some servers incorporate as many as eight AI GPUs.

On Monday, Nvidia announced at least three distinct versions of the Blackwell AI accelerator—a B100, a B200, and a GB200 that pairs two Blackwell GPUs with an Arm-based CPU. These variants feature slightly different memory configurations and are slated for release later this year.

Sajda Parveen
Sajda Parveen is a market expert. She has over six years of experience in the field and shares her expertise with readers. You can reach out to her at [email protected]