Intel Won't Compete with NVIDIA's High-End AI Dominance Anytime Soon, Begins Laying Off 2,200 Workers in the US
Intel also claims that Gaudi 3 is as power-efficient as the H100 for large language model (LLM) inference with small token outputs and does even better with larger outputs. The company even suggests Gaudi 3 beats NVIDIA’s newer H200 in LLM inference throughput for large token outputs. However, Gaudi 3 doesn’t match the H100 in overall floating-point operation throughput for 16-bit and 8-bit formats. For bfloat16 and 8-bit floating-point matrix math, Gaudi 3 hits 1,835 TFLOPS in each format, while the H100 reaches 1,979 TFLOPS for BF16 and 3,958 TFLOPS for FP8.
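For perspective, here is a minimal back-of-the-envelope sketch (using only the peak TFLOPS figures quoted above; the variable names and layout are illustrative, not from Intel or NVIDIA) of how the two chips compare on raw matrix throughput:

```python
# Peak dense matrix throughput in TFLOPS, as quoted in the article.
peak_tflops = {
    "Gaudi 3": {"BF16": 1835, "FP8": 1835},
    "H100": {"BF16": 1979, "FP8": 3958},
}

for fmt in ("BF16", "FP8"):
    ratio = peak_tflops["Gaudi 3"][fmt] / peak_tflops["H100"][fmt]
    print(f"{fmt}: Gaudi 3 reaches {ratio:.0%} of the H100's peak throughput")

# BF16: Gaudi 3 reaches 93% of the H100's peak throughput
# FP8: Gaudi 3 reaches 46% of the H100's peak throughput
```

In other words, the gap is modest at 16-bit precision, but at FP8 the H100 offers more than twice the peak throughput, which is why Intel frames Gaudi 3's value around inference efficiency and cost rather than peak compute.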
In an interview with CRN, Anil Nanduri, head of Intel’s AI acceleration office, stated that purchasing decisions for AI training infrastructure have primarily focused on performance rather than cost.
“And if you think in that context, there is an incumbent benefit, where all the frontier model research, all the capabilities are developed on the de facto platform where you’re building it, you’re researching it, and you are, in essence, subconsciously optimizing it as well. And then to make that port over [to a different platform] is work.
The world we are starting to see is people are questioning the [return on investment], the cost, the power and everything else. This is where—I don’t have a crystal ball—but the way we think about it is, do you want one giant model that knows it all?” said Nanduri.
Intel believes that for many businesses the answer is “no,” and they will instead opt for smaller, task-specific models with lower performance demands. Nanduri said that while Gaudi 3 can’t “catch up” to NVIDIA’s latest GPUs from a head-to-head performance standpoint, Gaudi 3 chips are well suited to systems running task-based and open-source models.
On a different subject, Intel has announced major job cuts in several states as part of its wider plan to shrink its workforce. The company will eliminate 1,300 positions in Oregon, 385 in Arizona, 319 in California, and 251 in Texas. Intel has a workforce of over 23,000 in Oregon, 12,000 in Arizona, 13,500 in California, and 2,100 in Texas. The layoffs are set to take place over a 14-day period starting November 15.