AMD’s flagship AI GPUs, the Instinct series, are reportedly being sold to Microsoft at a significant discount, offering a competitive alternative to Nvidia’s pricier H100. According to Tom’s Hardware, Nvidia’s H100 AI GPUs cost up to four times more than AMD’s competing MI300X, with prices peaking beyond $40,000.
Despite the price advantage, AMD’s market strategy is not expected to significantly impact Nvidia’s stronghold in the AI GPU market. This is due to Nvidia’s established CUDA software stack, which has been optimized for a wide range of AI applications and workloads, resulting in overwhelming demand for its GPUs.
AMD recently raised its guidance for artificial intelligence-related revenue to $3.5 billion this year, up from $2 billion. However, Citi analyst Christopher Danely believes that AMD may be deliberately lowballing that figure; he predicts the MI300 will generate $5 billion for AMD this year and $8 billion next year.
Big discount
Microsoft and Meta are reportedly the largest customers for the MI300, which was unveiled in December. The average selling price for Microsoft is roughly $10,000, while other customers pay $15,000 or more.
Despite AMD’s aggressive pricing strategy, risks remain. The PC market’s inventory-replenishment cycle may be ending, and Intel is catching up to Taiwan Semiconductor (TSM) in manufacturing.
Furthermore, direct price comparisons are complicated by the fact that Nvidia does not officially disclose the pricing of its H100 80GB products, which varies with factors such as batch size and the overall volume procured by a particular client.
While AMD’s Instinct GPUs are dramatically cheaper than Nvidia’s H100, the impact on Nvidia’s dominant market position remains to be seen. With AMD’s data center GPU sales expected to exceed $3.5 billion and supply still available, the company offers a stark contrast to Nvidia’s rumored 52-week wait times. However, given Nvidia’s established software ecosystem and high demand, AMD’s price advantage may not be enough to significantly disrupt the market.