AMD CEO presents chip to rival Nvidia and forecasts impressive growth

The company unveils the highly anticipated MI300 AI accelerator, targeting a market it expects to grow to more than $400 billion.

Eulerpool News Dec 7, 2023, 7:00 PM

At an event on Wednesday in San Jose, California, the US company Advanced Micro Devices Inc. presented its long-awaited lineup of AI accelerators, which, according to the company, run faster than competing products.

The so-called MI300 series is set to challenge Nvidia Corp.'s dominant position and was unveiled by CEO Lisa Su herself. Su also made an impressive forecast for the size of the AI chip market, which she said could exceed $400 billion within the next four years. That is more than twice the forecast AMD gave in August and shows how quickly expectations for AI hardware are changing. The launch represents one of the most important moments in AMD's five-decade history and sets the stage for a confrontation with Nvidia in the fiercely contested market for AI accelerators.

These chips help develop AI models by feeding them vast amounts of data, a task they handle far more efficiently than conventional computer processors. Su said in an interview that it is now possible to create artificial intelligence at a human level, which is considered the ultimate goal of computer science. The introduction of this technology, however, is only just beginning.

It will take time to assess its impact on productivity and other aspects of the economy, Su said. "The truth is, we are still very early," she explained. "This is not a passing trend. I firmly believe in it."

AMD is increasingly confident that the MI300 series can win over major technology companies, potentially redirecting billions of dollars in spending towards the company. According to AMD, customers for the processors will include Microsoft Corp., Oracle Corp., and Meta Platforms Inc., among others.

Nvidia shares fell 2.3% to $455.03 on Wednesday, a clear sign that investors perceive the new chip series as a threat. Shares of AMD, however, did not see a corresponding gain: on a day when technology stocks were broadly declining, they dropped 1.3% to $116.82.

The enormous demand for Nvidia chips from data center operators has lifted the company to new heights this year, with a market capitalization that now exceeds $1.1 trillion. The big question now is how long Nvidia will have the accelerator market practically to itself. AMD sees an opportunity: large language models, such as those behind AI chatbots like OpenAI's ChatGPT, require a huge amount of memory, and that is exactly where the company believes it has an advantage.
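To illustrate why memory capacity matters here, the following is a rough, back-of-the-envelope sketch, not taken from the article: it estimates how much accelerator memory the weights of a hypothetical 70-billion-parameter model would need in 16-bit precision.

```python
# Illustrative back-of-the-envelope estimate (not from the article):
# how much memory the weights of a large language model occupy.

def weight_memory_gb(num_parameters: float, bytes_per_parameter: int = 2) -> float:
    """Memory needed just to hold the model weights, in gigabytes."""
    return num_parameters * bytes_per_parameter / 1e9

# A hypothetical 70-billion-parameter model stored in 16-bit precision
params = 70e9
print(f"Weights alone: ~{weight_memory_gb(params):.0f} GB")  # ~140 GB

# Compare against two accelerator memory capacities (illustrative figures)
for name, capacity_gb in [("80 GB-class GPU", 80), ("192 GB-class GPU", 192)]:
    fits = weight_memory_gb(params) <= capacity_gb
    print(f"{name}: holds the weights without splitting the model -> {fits}")
```

By this estimate, the weights alone of such a model would overflow an 80 GB card but fit on a 192 GB one, which is the kind of gap AMD is pointing to.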

The new AMD chip has over 150 billion transistors and 2.4 times as much memory as Nvidia's H100, the current market leader. In addition, according to AMD, it also offers 1.6 times as much memory bandwidth, which further enhances performance.
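For context, here is the simple arithmetic behind those multipliers, a hedged sketch that takes Nvidia's publicly listed H100 figures (80 GB of HBM3, roughly 3.35 TB/s of memory bandwidth) as the baseline; those baseline numbers are not taken from this article.

```python
# Quick arithmetic behind the "2.4x memory / 1.6x bandwidth" comparison.
# Baseline H100 figures are Nvidia's published specs, not from this article;
# treat the results as approximate.

h100_memory_gb = 80
h100_bandwidth_tbs = 3.35

mi300_memory_gb = 2.4 * h100_memory_gb          # ~192 GB
mi300_bandwidth_tbs = 1.6 * h100_bandwidth_tbs  # ~5.4 TB/s

print(f"Implied MI300 memory:    ~{mi300_memory_gb:.0f} GB")
print(f"Implied MI300 bandwidth: ~{mi300_bandwidth_tbs:.1f} TB/s")
```

The implied figures, roughly 192 GB and about 5.4 TB/s, line up with the 192 GB of memory AMD cites for the MI300X.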

Su explained that the new chip is on par with Nvidia's H100 at training AI software and significantly better at the practical application of that software once it has been trained. Although the company is confident in the performance of its product, Su emphasized that this will not simply be a head-to-head battle between two companies, as many others will try to gain market share.

Nvidia is simultaneously developing its own next-generation chips. The H100 will be succeeded by the H200 in the first half of 2024, which will gain access to a new, higher-performance type of memory. That is expected to match at least some of the features of AMD's offering. Later in the year, Nvidia is then likely to bring an entirely new processor architecture to market.

AMD's forecast that the market for AI processors will grow to $400 billion by 2027 underscores the boundless optimism in the artificial intelligence industry. In comparison, the entire chip sector is projected to reach about $597 billion in 2024, according to IDC. As recently as August, AMD had given a more modest forecast of $150 billion for the same period.

But it will take some time before the company can claim a large share of that market for itself. AMD has said its own revenue from accelerators will exceed $2 billion in 2024, while analysts expect the chipmaker's total revenue to be around $26.5 billion.

The chips are based on the type of semiconductor known as the graphics processing unit, or GPU, which gamers have primarily used to get the most realistic visuals. Their ability to speed up a particular kind of computation by executing many calculations simultaneously makes them the preferred choice for training AI software.
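As a purely illustrative sketch, not specific to AMD or Nvidia hardware, the snippet below shows the kind of workload this refers to: a single layer of a neural network reduces to a large matrix multiplication whose many multiply-add operations are independent of one another and can therefore run in parallel across thousands of GPU cores.

```python
# Minimal illustration (not vendor-specific) of why GPUs suit AI training:
# the core workload is large matrix multiplication, whose many multiply-adds
# are independent and can be executed in parallel.

import numpy as np

batch, d_in, d_out = 1024, 4096, 4096
activations = np.random.rand(batch, d_in).astype(np.float32)  # one training batch
weights = np.random.rand(d_in, d_out).astype(np.float32)      # one layer's weights

# Each of the batch x d_out outputs is an independent dot product --
# exactly the uniform, massively parallel arithmetic GPUs accelerate.
outputs = activations @ weights
print(outputs.shape)  # (1024, 4096)
```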
