AMD’s chief executive believes the chipmaker is closing the performance gap with Nvidia’s market-leading artificial intelligence processors, as it unveiled new products targeting a market worth hundreds of billions of dollars.
On Thursday the Silicon Valley-based group announced that its MI325X chip will roll out to customers in the fourth quarter of this year, saying it offers “industry-leading” performance compared to Nvidia’s current generation of H200 AI chips.
AMD’s next-generation MI350 chip, which aims to compete with Nvidia’s new Blackwell system, is on track to ship in the second half of 2025.
The US chipmaker has come back from the brink of bankruptcy a decade ago, when Lisa Su took over as chief executive, to emerge as the leading challenger to Nvidia’s grip on the infrastructure powering generative AI.
Su said her aim is for the company to become the “end-to-end AI leader” over the next 10 years. “You have to be extremely ambitious,” Su told the Financial Times. “This is the beginning, not the end of the AI race.”
The announcement comes as Nvidia’s customers are expected to start deploying Blackwell in the current quarter, with Microsoft saying this week that it had become the first cloud provider to offer the latest GB200 chips to its customers.
While the so-called ‘hyperscalers’ — Microsoft, Google and Amazon — are also building their own in-house AI chips, AMD has become Nvidia’s closest competitor in the race to offer off-the-shelf AI chips.
It remains a distant second, however. AMD’s projected $4.5bn in AI chip sales for 2024 is small compared to the $26.3bn in AI data centre chip sales that Nvidia made in the quarter to the end of July alone.
But Su is confident that demand will only grow over the coming years. The company has predicted the total addressable market for AI chips will reach $400bn by 2027.
“When we first started, that was viewed as a really big number,” Su said. “And I think people are moving towards our big number because of the tremendous demand there is for AI infrastructure.”
Chips are just one part of the infrastructure needed to build cutting-edge AI systems. AMD on Thursday also announced new networking technology and upgrades to its ROCm software toolkit, all aimed at helping customers deploy AI infrastructure quickly and at scale.
“One of the things that we are really putting together is the end-to-end infrastructure for the data centre,” said Su. “People want a large cluster [of chips in a server] so you can train the largest language models.”
Su, who has a PhD in electrical engineering from the Massachusetts Institute of Technology, worked at Texas Instruments, IBM and Freescale Semiconductor before joining AMD in 2012 as a senior vice-president.
When she took over as chief executive in 2014, AMD’s shares were languishing at around $4, with some analysts predicting it would be bankrupt in a few years as it struggled to compete with Intel.
Today, AMD has captured a strong share of the server chip market and leapfrogged Intel on AI as it diversifies from its traditional PC business. AMD’s shares closed at $171 on Wednesday ahead of the announcement, giving it a market capitalisation of around $275bn — almost triple that of Intel.
Su sees AI as the primary driver of AMD’s next era of growth, and is seeking to land the same customers as Nvidia.
“People are really open to trying different architectures and seeing what fits their workload the best,” Su said. So far, both Microsoft and Meta have adopted AMD’s current generation of MI300 AI graphics processing units (GPUs).
Amazon, which is already a customer for AMD’s server CPUs, is likely to follow, Su said: “It’s a ‘point in time’ conversation.”
AMD’s approach parallels Nvidia’s strategy with Blackwell, which the company aims to sell not as individual chips but as whole server racks made up of multiple chips, combined with Nvidia’s own networking equipment.
To catch up, AMD has pursued an aggressive investment and acquisition strategy, including the recent announcement of its $4.9bn acquisition of ZT Systems, which builds servers for the small group of AI hyperscalers.
In terms of potential regulatory reviews of the deal, Su said that “our current expectation is US-EU [checks], and there are a few other jurisdictions as well, but we don’t pass thresholds for China at the moment”.