At the Advancing AI 2025 event in San Jose on June 12, AMD CEO Lisa Su introduced the company's latest products and an ecosystem built on open-source software and collaboration. Counterpoint analysts Jeongku Choi and David Naranjo were invited by AMD to experience the latest AI strategies and demos from partner companies. Many partners, including Meta, Microsoft, OpenAI and Oracle, spoke about their collaborations and achievements with AMD, highlighting the US-based chipmaker's influence.
At the event, AMD showcased products focused on data centers, demonstrating how they connect from the cloud to the edge and to the client. AMD presented the following three AI strategies:
Leadership Compute Engines: AMD expressed confidence in its flagship chip areas, CPUs and GPUs. The company noted that the EPYC series accounts for about 40% of the data center CPU market as of Q1 2025. AMD also showcased its GPU roadmap through 2027, showing confidence that it can compete with NVIDIA's product line. The slide below shows AMD's end-to-end compute portfolio.
Source: AMD
Open Ecosystem: AMD supports open standards such as the Ultra Ethernet Consortium (UEC), Ultra Accelerator Link (UALink) and the ROCm open-source AI platform.
Full Stack Solution: AMD is investing in full-stack solutions from chipsets to server systems and has acquired and invested in more than 25 companies in 2024 alone.
In addition to industry applications, AMD is also actively participating in sovereign AI. It is collaborating with various companies in the US, Canada and Europe to help governments popularize AI. The table below shows AMD's partnerships with countries around the world.
Source: AMD
The Instinct MI350 series is AMD's latest GPU, built on a 3nm process node and based on the CDNA 4 architecture, with 185 billion transistors and 288GB of HBM3e memory. It delivers more than 3x the model training and inference speed of its predecessor, the MI300X. It holds a memory-capacity advantage over competitor NVIDIA's B200 and supports both air and liquid cooling.
Source: AMD
The next product, the MI400 series, is set to be released in 2026. It is expected to offer up to 432GB of HBM4, 40 petaflops of FP4 performance and 300GB/s of scale-out bandwidth.
The EPYC 9005 series, featuring the 5th-generation “Turin” CPUs, is already used by many customers. The 6th-generation “Venice” is scheduled for release in Q3 2026. Based on the Zen 6 architecture, “Venice” scales up to 256 cores and provides 1.7x the performance and 1.6TB/s of memory bandwidth compared to its predecessor.
“Verano” is scheduled to be released in 2027 and will form a new infrastructure together with the Instinct MI500 series.
Source: AMD
AMD recognizes the importance of networking. It acquired Pensando in 2022 and showcased the Pollara 400 AI NIC and the Salina DPU at the Advancing AI 2025 event. As the name suggests, the Pollara 400 offers 400Gbps of throughput, is optimized for packet control and error detection, and supports UEC standards. The Salina DPU, equipped with 64GB of DDR5 memory, delivers twice the data processing performance of the previous generation in cloud and AI front-end infrastructure.
The next product is "Vulcano," which follows UEC standards and supports both UALink and PCIe for direct communication between CPUs and GPUs. It will be built on a 3nm process and offer 800Gbps of throughput, twice that of the current generation.
UALink is an open standard that can be used with any CPU, any accelerator and any switch. Below is a table comparing the benefits of UALink versus NVIDIA's NVLink Fusion.
Source: AMD
When host Anush Elangovan introduced ROCm 7 at the Advancing AI event, the crowd chanted a resounding “Developer! Developer! Developer!”, underscoring ROCm's focus on developer convenience. ROCm is based on open-source software, supports Linux and Windows, and provides the same development environment from PC to cloud. To strengthen the ecosystem around ROCm, AMD is collaborating with many companies and supporting many technologies.
Source: AMD
AMD has created a developer cloud where developers can experience AMD hardware and the ROCm platform (http://devcloud.amd.com); the first 1,500 developers receive 25 complimentary GPU hours.
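For developers, the "same environment from PC to cloud" claim is concrete in practice: ROCm builds of PyTorch reuse the familiar `cuda` device namespace, so a script written for NVIDIA GPUs typically runs unchanged on AMD hardware. A minimal sketch, assuming a PyTorch install with either ROCm or CUDA support (it falls back to CPU otherwise):

```python
# Minimal sketch: the same PyTorch code path works on ROCm and CUDA builds.
# ROCm builds of PyTorch expose AMD GPUs through the "cuda" device namespace,
# so no AMD-specific code is needed; CPU is used when no GPU is available.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
x = torch.randn(4, 4, device=device)
y = x @ x.T  # a small matrix multiply on whichever device was selected

# torch.version.hip is set on ROCm builds and None on CUDA/CPU-only builds.
print(f"running on {device}, hip={torch.version.hip}")
print(y.shape)
```

Checking `torch.version.hip` is the usual way to tell a ROCm build apart from a CUDA one at runtime, since both report the device type as `cuda`.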
ROCm is also being extended to enterprise AI. It is an end-to-end solution that offers secure data integration and can be deployed easily.
Source: AMD
AMD introduced “Helios,” its next-generation AI rack built on next-generation AMD compute products: the EPYC “Venice” server CPU, the Instinct MI400 GPU and the “Vulcano” network solution. With UALink for scale-up and UEC for scale-out, along with the latest chips, Helios is expected to enable more flexible systems in 2026.
Source: AMD
Source: Counterpoint Research