
AMD reveals next-generation AI chips with OpenAI CEO Sam Altman

Lisa Su, CEO of Advanced Micro Devices, testifies during the Senate Commerce, Science and Transportation Committee hearing titled "Winning the AI Race: Strengthening U.S. Capabilities in Computing and Innovation," in the Hart Building on Thursday, May 8, 2025.

Tom Williams | CQ-Roll Call, Inc. | Getty Images

Advanced Micro Devices on Thursday unveiled new details about its next-generation AI chips, the Instinct MI400 series, which will ship next year.

The MI400 chips can be assembled into a full server rack called Helios, AMD said, which will allow thousands of the chips to be tied together and used as one "rack-scale" system.

"For the first time, we architected every part of the rack as a unified system," AMD CEO Lisa Su said at a launch event in San Jose, California, on Thursday.

OpenAI CEO Sam Altman appeared on stage with Su and said his company would use the AMD chips.

"When you first started telling me about the specs, I was like, there's no way, that just sounds totally crazy," Altman said. "It's gonna be an amazing thing."

AMD's rack-scale setup will make the chips appear to a user as one system, which is important for most artificial intelligence customers, such as cloud providers and companies that develop large language models. Those customers want "hyperscale" clusters of AI computers that can span entire data centers and draw massive amounts of power.

"Think of Helios as really a rack that functions like a single, massive compute engine," said Su, comparing it with Nvidia's Vera Rubin racks, which are expected to be released next year.

OpenAI CEO Sam Altman poses during the Artificial Intelligence (AI) Action Summit at the Grand Palais in Paris on February 11, 2025.

Joel Saget | AFP | Getty Images

AMD's rack-scale technology also enables its latest chips to compete with Nvidia's Blackwell chips, which already come in configurations with 72 graphics processing units stitched together. Nvidia is AMD's primary and only rival in big data center GPUs for developing and deploying AI applications.

OpenAI, a notable Nvidia customer, has been giving AMD feedback on its MI400 roadmap, the chip company said. With the MI400 chips and this year's MI355X chips, AMD is planning to compete against rival Nvidia on price. A company executive told reporters on Wednesday that the chips will cost less to operate thanks to lower power consumption, and that AMD is undercutting Nvidia with "aggressive" prices.

So far, Nvidia has dominated the market for data center GPUs, partly because it was the first company to develop the kind of software needed for AI developers to take advantage of chips originally designed to display graphics for 3D games. Over the past decade, before the AI boom, AMD focused on competing against Intel in server CPUs.

Su said that AMD's MI355X can outperform Nvidia's Blackwell chips, despite Nvidia's use of its "proprietary" CUDA software.

"It says that we have really strong hardware, which we always knew, but it also shows that the open software frameworks have made tremendous progress," Su said.

AMD shares are flat so far in 2025, signaling that Wall Street doesn't yet see the company as a major threat to Nvidia's dominance.

Andrew Dieckmann, AMD's general manager for data center GPUs, said Wednesday that AMD's AI chips would cost less to operate and less to acquire.

"Across the board, there is a meaningful cost of acquisition delta that we then layer our performance competitive advantage on top of, so significant double-digit percentage savings," Dieckmann said.

Over the next few years, big cloud companies and countries alike are poised to spend hundreds of billions of dollars to build new data center clusters around GPUs in order to accelerate the development of cutting-edge AI models. That includes $300 billion this year alone in planned capital expenditures from megacap technology companies.

AMD expects the total market for AI chips to exceed $500 billion by 2028, although it hasn't said how much of that market it can claim. Nvidia currently holds over 90% of the market, according to analyst estimates.

Both companies have committed to releasing new AI chips on an annual basis, as opposed to every two years, underscoring how fierce competition has become and how important bleeding-edge AI chip technology is for companies like Microsoft, Oracle and Amazon.

AMD has bought or invested in 25 AI companies in the past year, Su said, including the purchase earlier this year of ZT Systems, a server maker that developed the technology AMD needed to build its rack-sized systems.

"These AI systems are getting super complicated, and full-stack solutions are really critical," Su said.

What AMD is selling now

Currently, the most advanced AMD AI chip being installed by cloud providers is its Instinct MI355X, which the company said started shipping in production last month. AMD said the chip would be available for rent from cloud providers starting in the third quarter.

Companies building large data center clusters for AI want alternatives to Nvidia, not only to keep costs down and provide flexibility, but also to fill a growing need for "inference," the computing power needed to actually deploy a chatbot or generative AI application, which can use much more processing power than traditional server applications.

"What has really changed is the demand for inference has grown significantly," Su said.

AMD officials said Thursday that they believe their new chips are superior to Nvidia's for inference. That's because AMD's chips are equipped with more high-speed memory, which allows bigger AI models to run on a single GPU.

The MI355X has seven times the computing power of its predecessor, AMD said. The chips will be able to compete with Nvidia's B100 and B200 chips, which have been shipping since late last year.

AMD said its Instinct chips have been adopted by seven of the 10 largest AI customers, including OpenAI, Tesla, xAI and Cohere.

Oracle plans to offer clusters with over 131,000 MI355X chips to its customers, AMD said.

Officials from Meta said Thursday that the company uses clusters of AMD's CPUs and GPUs to run inference for its Llama model, and that it plans to buy AMD's next-generation servers.

A Microsoft representative said the company uses AMD chips to serve its Copilot AI features.

Competing on price

AMD declined to say how much its chips cost. It doesn't sell chips on their own, and end users usually buy them through a hardware company like Dell or Super Micro Computer. But the company is planning for the MI400 chips to compete on price.

The Santa Clara company is pairing its GPUs with its CPUs and with networking chips from its 2022 acquisition of Pensando to build its Helios racks, which means broader adoption of its AI chips should also benefit the rest of AMD's business. It is also using an open-source networking technology called UALink to closely integrate its rack systems, as opposed to Nvidia's proprietary NVLink.

AMD claims its MI355X can deliver 40% more tokens (a measure of AI output) per dollar than Nvidia's chips, because its chips use less power than its rival's.

Data center GPUs can cost tens of thousands of dollars per chip, and cloud companies usually buy them in large quantities.

AMD's AI chip business is still much smaller than Nvidia's. The company said it had $5 billion in AI sales in its fiscal 2024, but JPMorgan analysts expect 60% growth in the category this year.

WATCH: AMD CEO Lisa Su: Chip export controls are a headwind but we still see growth opportunity
