But while Nvidia remains an AI infrastructure titan, it's facing stiffer competition than ever from rival AMD, which is quickly gaining share among early adopters of its Instinct MI300-series GPUs.
Despite growing adoption among key customers like Microsoft and Meta, AMD's slice of the broader GPU market remains small compared with Nvidia's.
How much of this demand is driven by limited supply of Nvidia hardware is hard to say, but at least on paper, AMD's MI300X accelerators offered a number of advantages. Introduced a year ago, the MI300X claimed 1.3x higher floating-point performance for AI workloads, along with 60 percent higher memory bandwidth and 2.4x the memory capacity of Nvidia's venerable H100.
Even with Nvidia's Blackwell, which is only just beginning to reach customers, pulling ahead on performance and memory bandwidth, AMD's new MI325X still holds a capacity advantage at 256 GB per GPU. Its more powerful MI355X, slated for release late next year, will push this to 288 GB.
Omdia expects Nvidia to struggle over the next year to grow its share of the AI server market as AMD, Intel, and the cloud service providers push alternative hardware and services.
“If we’ve learned anything from Intel, once you’ve reached 90-plus percent share, it’s impossible to continue to grow. People will immediately look for an alternative,” Galabov said.
It's interesting that while analysts are making predictions about the competitive dynamics between AMD, Intel, Nvidia, and custom CSP silicon, few seem to entertain the possibility of a glut in AI compute hardware due to a lack of revenue generation from "AI" services. Albeit, maybe this isn't the type of article for that sort of thing.