Intel has revealed pricing for its new Gaudi 3 accelerators, which are set to significantly undercut Nvidia’s GPUs in the AI hardware market. The announcement was made at Computex, where Intel CEO Pat Gelsinger disclosed pricing for the company’s current Gaudi 2 and next-generation Gaudi 3 AI accelerator chips.
The flagship Gaudi 3 accelerator will cost around $15,000 per unit when purchased individually, roughly half the price of Nvidia’s competing H100 data center GPU. The older Gaudi 2, while less powerful, undercuts Nvidia’s pricing even more dramatically: a complete eight-chip Gaudi 2 accelerator kit will sell to system vendors for $65,000, which Intel says is about one-third the cost of comparable competitive platforms. An eight-accelerator Gaudi 3 kit is priced at $125,000, which Intel estimates is roughly two-thirds the cost of alternative solutions at that high-end performance tier.
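As a rough sanity check, the quoted percentages imply approximate competitor prices. The short sketch below is illustrative arithmetic based only on the figures cited above; the resulting Nvidia-side numbers are implied estimates, not published list prices.

```python
# Back-of-the-envelope arithmetic using only the figures quoted in the article.
# Competitor prices are implied values, not published Nvidia list prices.

gaudi3_unit = 15_000   # quoted Gaudi 3 price per accelerator
gaudi2_kit = 65_000    # eight-chip Gaudi 2 kit price to system vendors
gaudi3_kit = 125_000   # eight-accelerator Gaudi 3 kit price

# "Roughly half the price" of the H100 implies an H100 price of about:
implied_h100 = gaudi3_unit / 0.5                       # ~$30,000 per GPU

# "One-third the cost" implies a comparable 8-GPU competitor platform of about:
implied_competitor_gaudi2 = gaudi2_kit * 3             # ~$195,000

# "Two-thirds the cost" implies a comparable high-end platform of about:
implied_competitor_gaudi3 = gaudi3_kit / (2 / 3)       # ~$187,500

print(implied_h100, implied_competitor_gaudi2, implied_competitor_gaudi3)
```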
In terms of performance, Intel asserts that the Gaudi 3 keeps pace with or outperforms Nvidia’s H100 across a range of important AI training and inference workloads. Benchmarks cited by Intel show the Gaudi 3 delivering up to 40 percent faster training times than the H100 in large 8,192-chip clusters, and even a smaller 64-chip Gaudi 3 setup is claimed to offer 15 percent higher training throughput than the H100 on Meta’s popular Llama 2 language model.
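Combining the pricing and the 64-chip throughput claim gives a rough sense of the price-performance gap Intel is pitching. The sketch below is an illustrative calculation from the article’s numbers alone, assuming the ~$30,000 implied H100 price derived earlier; it is not an Intel-published benchmark.

```python
# Illustrative price-performance ratio derived from the article's figures only.
# Assumes the implied ~$30,000 H100 price from the pricing comparison above.

gaudi3_price = 15_000
implied_h100_price = 30_000

# Normalize throughput: H100 = 1.0, Gaudi 3 claimed 15% higher on Llama 2
# in the 64-chip configuration.
h100_throughput = 1.00
gaudi3_throughput = 1.15

perf_per_dollar_h100 = h100_throughput / implied_h100_price
perf_per_dollar_gaudi3 = gaudi3_throughput / gaudi3_price

# Roughly 2.3x better throughput per dollar under these assumptions.
print(round(perf_per_dollar_gaudi3 / perf_per_dollar_h100, 2))
```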
However, while the Gaudi chips rely on open standards such as Ethernet for networking, which simplifies deployment, they cannot run software built for Nvidia’s ubiquitous CUDA platform, on which most of today’s AI software stack depends. To drive adoption, Intel says it has lined up at least 10 major server vendors, including new Gaudi 3 partners such as Asus, Foxconn, Gigabyte, Inventec, Quanta, and Wistron.
Read more: www.techspot.com