Meta and Microsoft to buy AMD’s new AI chip as an alternative to Nvidia

  • Meta, OpenAI and Microsoft said they are using AMD’s new AI chip, the Instinct MI300X — a sign that tech companies are looking to replace expensive Nvidia graphics processors essential to artificial intelligence.
  • If the MI300X is good enough and cheap enough when it starts shipping early next year, it could reduce the cost of developing AI models.
  • AMD CEO Lisa Su predicts the market for AI chips will be worth $400 billion or more by 2027, and said she believes AMD can capture a significant portion of that market.

Lisa Su shows off the AMD Instinct MI300 chip during a keynote speech at CES 2023 on January 4, 2023 in Las Vegas, Nevada.

David Becker | Getty Images

Meta, OpenAI and Microsoft announced at an AMD investor event on Wednesday that they will use AMD’s new AI chip, the Instinct MI300X — the biggest sign yet that tech companies are searching for alternatives to the expensive Nvidia graphics processors needed to build and deploy artificial intelligence programs such as OpenAI’s ChatGPT.

If AMD’s latest high-end chip is good enough for tech companies and cloud service providers to build and serve AI models when it starts shipping early next year, it could lower the cost of developing AI models and put competitive pressure on Nvidia’s surging AI chip sales.

“There’s all the interest in big iron and big GPUs for the cloud,” AMD CEO Lisa Su said Wednesday.

AMD says the MI300X is based on a new architecture, which often leads to significant performance gains. Its most distinctive feature is 192GB of cutting-edge, high-performance HBM3 memory, which transfers data faster and can accommodate larger AI models.


Su directly compared the MI300X and its built-in systems to Nvidia’s flagship AI GPU, the H100.

“What this efficiency means is that it translates directly into a better user experience,” Su said. “When you ask a model for something, you want it to come back fast, especially when the answers are more complex.”

A key question facing AMD is whether companies building on Nvidia will invest the time and money to add another GPU supplier. “Adoption of AMD requires work,” Su said.

AMD told investors and partners on Wednesday that it has improved its software package, called ROCm, to compete with Nvidia’s industry-standard CUDA software.

Price can also be important. AMD did not disclose pricing for the MI300X on Wednesday, but Nvidia’s chip costs $40,000, and Su told reporters that AMD’s chip would have to undercut Nvidia to persuade customers to buy it.

AMD MI300X Accelerator for Artificial Intelligence.

On Wednesday, AMD said it had already signed up some of the companies hungriest for GPUs to use the chip. Meta and Microsoft were the two largest purchasers of Nvidia’s H100 GPUs in 2023, according to a recent report from research firm Omdia.

Meta said it will use MI300X GPUs for AI inference workloads such as processing AI stickers, image editing and running its assistant.

Microsoft’s CTO, Kevin Scott, said the company will provide access to the MI300X chips through its Azure web service.

Oracle’s cloud will also use the chips.

OpenAI said it will support AMD GPUs in one of its software products, called Triton, which isn’t a large language model like GPT but is used in AI research to access chip capabilities.


AMD isn’t yet predicting huge sales for the chip, projecting only about $2 billion in total data center GPU revenue in 2024. Nvidia reported more than $14 billion in data center sales in the most recent quarter alone, although that figure includes chips other than GPUs.

However, AMD says the total market for AI GPUs could climb to $400 billion over the next four years, double the company’s previous projection. That shows how high expectations for high-end AI chips have become, and why the company is now focusing investors’ attention on the product line.

Su also suggested to reporters that AMD doesn’t think it needs to beat Nvidia to do well in the market.

“I think it’s clear to say that Nvidia has to be the majority of that right now,” Su told reporters about the AI chip market. “We believe it will be more than $400 billion by 2027. And we can get a good chunk of that.”
