Advancing AI 2025: AMD's Latest Products Focused on Openness

Jun 19, 2025
  • AMD laid out its AI strategy and new product lines at the Advancing AI 2025 event under three themes: Leadership Compute Engines, Open Ecosystem and Full-Stack Solutions.
  • AMD’s new Instinct MI350 series is available now. Across broad AI use cases, the MI355X delivers over 3x the generational inference improvement of its predecessor, the MI300X.
  • AMD presented AI rack solutions for at-scale training and distributed inference. “Helios”, its optimized AI rack solution, will be available in 2026.
  • AMD emphasized the importance of an ecosystem based on openness and announced investments in developers, launching a Developer Cloud that supports the ROCm platform and the latest AMD products.

At the Advancing AI 2025 event in San Jose, held on June 12, AMD CEO Lisa Su introduced the company’s latest products and an ecosystem based on open-source software and collaboration. Counterpoint analysts Jeongku Choi and David Naranjo were invited by AMD to experience the latest AI strategies and demos from partner companies. Many partners, including Meta, Microsoft, OpenAI and Oracle, spoke about their collaborations and achievements with AMD, highlighting the US-based chipmaker’s influence.

At this event, AMD showcased products focused on data centers, demonstrating how they connect from the cloud to the edge and to the client. AMD presented the following three AI strategies:

Leadership Compute Engines: AMD expressed confidence in its flagship chipset, CPU and GPU lines. The company noted that its EPYC series accounted for about 40% of the data center CPU market as of Q1 2025. AMD also showcased its GPU roadmap through 2027, signaling confidence that it can compete with NVIDIA’s product line. The chart below shows AMD’s end-to-end compute portfolio.

Counterpoint Research Best End-to-End AI compute portfolio in the industry

Source: AMD

Open Ecosystem: AMD supports open standards, such as Ultra Ethernet Consortium (UEC), Ultra Accelerator Link (UAL) and the ROCm open-source AI platform.

Full Stack Solution: AMD is investing in full-stack solutions from chipsets to server systems and has acquired and invested in more than 25 companies in 2024 alone.

In addition to industry customers, AMD is also actively participating in sovereign AI. It is collaborating with various companies in the US, Canada and European countries to help governments popularize AI. The table below shows AMD’s partnerships with various countries around the world.

Counterpoint Research - advancing national economies

Source: AMD

Instinct MI350 series

The Instinct MI350 series is AMD's latest GPU, built on a 3nm process node with the CDNA4 architecture, packing 185 billion transistors and 288GB of HBM3e, and delivering more than 3x the model training and inference speed of its predecessor, the MI300X. It holds a memory-capacity advantage over its competitor, NVIDIA's B200, and supports both air-cooled and liquid-cooled configurations.

Counterpoint Research - AMD Instinct, MI350 Series

Source: AMD

The next product, the MI400 series, is set for release in 2026. It is expected to offer up to 432GB of HBM4, 40 petaflops of FP4 performance and 300GB/s of scale-out bandwidth.

EPYC “Venice” CPU

The EPYC 9005 series, featuring the 5th-generation “Turin” core, is already used by many customers. The 6th-generation “Venice” is scheduled for release in Q3 2026. Based on the Zen 6 architecture, “Venice” scales up to 256 cores and delivers 1.7x the performance and 1.6TB/s of memory bandwidth compared to its predecessor.

“Verano” is scheduled to be released in 2027 and will form a new infrastructure together with the Instinct MI500 series.

Counterpoint Research - Advancing AI infrastructure on an annual cadence

Source: AMD

Pensando Series

AMD recognizes the importance of networking. It acquired Pensando in 2022 and showcased the Pollara 400 AI NIC and the Salina DPU at the Advancing AI 2025 event. As its name suggests, the Pollara 400 offers 400Gbps throughput, is optimized for packet control and error detection, and supports UEC standards. The Salina DPU, equipped with 64GB of DDR5, delivers twice the performance of the previous generation in cloud and AI front-end infrastructure.

The next product, "Vulcano," follows UEC standards and supports both UALink and PCIe for direct communication between CPUs and GPUs. Built on a 3nm process, it will offer 800Gbps throughput, twice that of the current generation.

UALink is an open standard that can be used with any CPU, any accelerator and any switch. Below is a table comparing the benefits of UALink versus NVIDIA's NVLink Fusion.

Counterpoint Research - Ultra accelerator link, the truly open standard

Source: AMD

ROCm 7 Platform

When host Anush Elangovan introduced ROCm 7 at the Advancing AI event, the crowd chanted a resounding “Developer! Developer! Developer!”, underscoring ROCm’s focus on developer convenience. ROCm is based on open-source software and supports both Linux and Windows, providing the same development environment from PC to cloud. To strengthen the ecosystem around ROCm, AMD is collaborating with many companies and supporting a wide range of technologies.
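One practical upshot of this “same environment from PC to cloud” claim: ROCm’s HIP backend is surfaced through PyTorch’s familiar `torch.cuda` namespace, so existing GPU scripts typically run unchanged on AMD hardware. A minimal sketch, assuming a PyTorch install (ROCm, CUDA, or CPU-only builds all behave the same here):

```python
import torch

# On ROCm builds of PyTorch, torch.cuda.is_available() reports AMD GPUs
# via the HIP backend, so no AMD-specific code path is required.
device = "cuda" if torch.cuda.is_available() else "cpu"

x = torch.randn(64, 64, device=device)
y = (x @ x.T).relu()   # matmul + activation on whichever device is present

print(tuple(y.shape))  # (64, 64)
```

The same script runs on a Radeon-equipped PC, an Instinct node in the Developer Cloud, or a CPU-only laptop, which is the portability story AMD is emphasizing.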

AMD ROCm - Deepening ecosystem collaboration

Source: AMD

AMD has also launched a Developer Cloud (http://devcloud.amd.com) where developers can try out AMD hardware and the ROCm platform; the first 1,500 developers receive 25 complimentary GPU hours.

ROCm is also being extended to enterprise AI. It is an end-to-end solution that offers secure data integration and can be deployed easily.

Source: AMD

Rack Scale Solutions

AMD introduced “Helios”, a next-generation AI rack built on next-generation AMD compute products: the EPYC “Venice” server CPU, the Instinct MI400 GPU and “Vulcano” network solutions. Combining UALink for scale-up and UEC for scale-out with the latest chipsets, Helios will enable more flexible systems in 2026.

Counterpoint Research - AMD Helios, optimized AI rack solution

Source: AMD

Partners’ Comments

  • Meta: “MI300X GPUs are a key part of our infrastructure for Llama inference. We are optimizing Llama models using AMD GPUs.”
  • Oracle: “AMD Infinity Fabric accelerates AI data transfer between CPU and GPU memories. Many OCI customers use ROCm regularly.”
  • Humain: “Through our joint venture with AMD, a 100-gigawatt data center will be built in Saudi Arabia by 2030.”
  • Microsoft: “Instinct chips give us great performance and TCO benefits. Open-source models provide really great quality, and ROCm is easy to deploy.”
  • Cohere: “Our model stack is deployed on AMD. Agentic AI and complex reasoning create memory pressure; higher capacity and strong bandwidth are needed.”
  • Red Hat: “Open models are easy; Red Hat AI and AMD’s process together bring an efficient, production-ready AI environment.”
  • Astera Labs: “UALink is fast, robust and truly open: a rack-scale open platform built on the ecosystem.”
  • Marvell: “We have helped evolve UALink from the beginning. UALink is a next-generation scale-up fabric offering interoperability between GPUs and switches for next-generation AI infrastructure.”
  • OpenAI (Sam Altman): “Extremely excited for the MI450. The memory architecture is great for inference.”


Key Insights from Partners’ Showcases

  • Many rack servers featuring AMD silicon were showcased. MiTAC and Pegatron showed a 42U rack server combining “Turin” CPUs and liquid-cooled MI355X GPUs.
  • Many partners such as Dell and Inventec presented server solutions with AMD.

Source: Counterpoint Research

  • Physical AI is coming. mimik presented robots powered by AMD products.

Source: Counterpoint Research

  • There were also quite a few consumer products at the demo site, including a gaming PC with Radeon graphics and a Copilot+ PC powered by AMD.

Source: Counterpoint Research

  • A demo showed an LLM actually running on the ROCm platform on AMD Radeon GPUs.

Source: Counterpoint Research

Analyst Take

  • At this event, unlike at Computex 2025, AMD focused on new product introductions for data centers, with specific callouts of its technologies and specifications versus NVIDIA’s data center solutions.
  • AMD highlighted its commitment to open source as its competitive advantage. The company showcased plans to accelerate its product cadence to an annual rhythm, from client to data center, built on open source. As AMD tries to gain mindshare for its offerings, the open-source narrative seems to be resonating with partners.

Published

Jun 19, 2025

Author

Team Counterpoint

Counterpoint Research is a global industry and market research firm providing market data, intelligence, thought leadership and consulting across the technology ecosystem. We advise a diverse range of global clients spanning the supply chain – from chipmakers, component suppliers, manufacturers and software and application developers to service providers, channel players and investors. Our veteran team of analysts serve these clients through our offices located across the key innovation hubs, manufacturing clusters and commercial centers globally. Our analysts consistently engage with C-suite through to strategy, market intelligence, supply chain, R&D, product management, marketing, sales and others across the organization. Counterpoint’s key coverage areas: AI, Automotive, Cloud, Connectivity, Consumer Electronics, Displays, eSIM, IoT, Location Platforms, Macroeconomics, Manufacturing, Networks & Infra, Semiconductors, Smartphones and Wearables.