Intel’s Gaudi 3 Chips Take on Nvidia in the AI Arena

Intel recently introduced its latest artificial intelligence chip, Gaudi 3, as chipmakers race to build semiconductors that can train and deploy large AI models, such as the one behind OpenAI’s ChatGPT.

According to Intel, the new Gaudi 3 chip is more than twice as power-efficient as Nvidia’s H100 GPU.

Intel also says it can run AI models one and a half times faster.

AMD joins the AI chip market with MI300X, attracting Meta and Microsoft as customers. (Credits: AMD)

The chip comes in several configurations, including a package of eight Gaudi 3 chips on a single motherboard and a card that slots into existing systems.

Intel tested the chip on models including Meta’s Llama and the Abu Dhabi-backed Falcon.

The company found that Gaudi 3 can be used to train or deploy a range of models, including Stable Diffusion and OpenAI’s Whisper model for speech recognition.

Intel also reiterated that its chips consume less power than Nvidia’s.

Nvidia currently dominates the AI chip market with around 80% share, primarily with its GPUs, which have been the preferred choice for AI developers in recent times.

Intel collaborates with industry giants to develop open software, aiming for flexible chip solutions. (Credits: Intel)

Intel announced that the new Gaudi 3 chips will be available for customers to buy in the third quarter.

Major manufacturers including Dell, HP, and Supermicro will build systems with the chips, though Intel has not yet disclosed pricing.

According to CNBC, Das Kamhout, vice president of Xeon software at Intel, told reporters, “We do expect it to be highly competitive” with Nvidia’s latest chips.

“From our competitive pricing and our distinctive open integrated network on chip, we’re using industry-standard Ethernet. We believe it’s a strong offering,” he added.

Demand for artificial intelligence in large-scale computing systems, such as those operated by cloud providers and enterprises, is surging.

That surge is creating room for AI chip suppliers beyond Nvidia.

Nvidia’s AI chips are also expensive, so companies are eager for alternatives that can cut costs.

Nvidia introduces B100 and B200 GPUs, promising enhanced performance for AI applications. (Credits: X)

Nvidia’s stock has surged on the strength of its AI chip business, while Intel’s shares have lagged.

AMD is also pushing to sell more AI chips for servers, and it already counts Meta and Microsoft among its major customers.

Nvidia recently announced its new B100 and B200 GPUs, which promise better performance than their predecessors and are expected to ship later this year.

Part of Nvidia’s success rests on CUDA, its proprietary software platform that developers use to program its chips for AI workloads.

Intel is teaming up with Google, Qualcomm, and Arm to create open software that any developer can use, reducing dependence on Nvidia’s ecosystem.

The Gaudi 3 chips are built on an advanced 5-nanometer manufacturing process. Intel is also planning to manufacture AI chips for other companies at a new factory in Ohio, which is expected to open in the coming years.

Sajda Parveen
Sajda Parveen is a market expert. She has over six years of experience in the field and shares her expertise with readers. You can reach out to her at [email protected]