Memory Innovation Crucial to Success of Agentic AI Era

Jun 10, 2025
  • Rapid advancements in agentic AI capabilities of smartphones are transforming devices into intelligent, context-aware and increasingly autonomous agents
  • Advanced AI workloads place unprecedented strain on smartphone hardware, with memory systems emerging as a critical bottleneck — current flagship DRAM capacities (12GB+) are just the baseline, with requirements expected to reach 32GB or more
  • Full agentic AI on the smartphone requires robust on-device processing for lower latency, enhanced user privacy, cost efficiency and offline functionality
  • Delivering on these demands requires concerted, ecosystem-wide collaboration and innovation — solutions like LPDDR6, processing-in-memory (PIM), Wide I/O, advanced packaging techniques and other approaches are crucial
  • Micron and others are developing these technologies, but no entity can succeed alone — robust standardization (JEDEC) and strategic investments including government initiatives are essential to usher in the intelligent, agentic smartphone era
  • Download the Whitepaper “Memory At The Edge: Powering GenAI’s Next Frontier” to understand the full context around memory and how it will enable delivery of truly agentic AI on the smartphone


GenAI Smartphone Evolution: From You+ to You²

Today’s GenAI smartphones are evolving into more intelligent, contextually aware devices as increasing agentic AI capabilities shift them from simple communication hubs to truly intelligent, autonomous companions.

Agentic AI Task Orchestration Across Apps

Organizational chart of agentic AI task orchestration across apps. The user is at the top; it commands the AI orchestrator agent, which communicates with the smartphone OEM’s agent and third-party agents such as email, calendar, booking, and other apps. Source: Counterpoint Research AI 360 Service.


Powering Personalization

The biggest challenge in making agentic AI possible on smartphones lies at the hardware level: powering an ever-growing universe of AI capabilities within the strict limits of battery life, processing power, and memory.

Having the core functionalities of agentic AI — real-time responsiveness, deep personalization and proactive assistance — sit at the edge is imperative, given critical requirements such as low latency, enhanced privacy, cost and bandwidth efficiency, offline access and personalization.

“Agentic AI will raise smartphones to a whole new category, enabling them to become proactive digital companions capable of complex intent recognition and real-time adaptation,” says Neil Shah, Counterpoint VP of research. “This means devices will have full contextual understanding and move beyond today’s ‘you+’ paradigm, where AI simply assists, to a ‘you²’ model, where AI is a digital extension of you, powered by multiple personalized learning models running at the edge.”

As a result, a major leap in the capabilities of smartphone hardware components will be necessary for agentic AI to deliver on its promise of handling relevant AI workloads at the edge. Processors (SoCs), memory, storage, battery, sensors and interconnects, and thermal management will all need significant upgrades from today’s topline capabilities.

Typical Hardware Components for Today’s GenAI Smartphone

Conceptual image of the inside of a GenAI smartphone. Hardware includes heat dissipation, storage, memory, camera, compute, sensors, and a battery. Source: Counterpoint Research AI 360 Service.


In particular, requirements on memory subsystems are rising fast with the ever-increasing need to supply data quickly and efficiently for advanced on-device AI.

“Growth in memory bandwidth is being outpaced by compute performance and we’re approaching the point where simply adding more conventional DRAM is not a viable long-term solution,” observes Christopher Moore, VP of marketing at Micron. “Architectural innovation is absolutely essential now.”
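The bandwidth wall Moore describes can be made concrete with a back-of-the-envelope sketch. All figures and the function below are illustrative assumptions, not numbers from Micron or Counterpoint: during token-by-token generation, an LLM must stream essentially all of its weights from DRAM once per token, so memory bandwidth — not compute — sets the ceiling on responsiveness.

```python
# Back-of-the-envelope sketch (hypothetical numbers): why memory bandwidth,
# not compute, often caps on-device LLM inference. Each generated token
# requires streaming roughly all model weights from DRAM once.

def max_tokens_per_sec(model_params_billions: float,
                       bytes_per_param: float,
                       dram_bandwidth_gbps: float) -> float:
    """Upper bound on decode speed when inference is bandwidth-bound."""
    model_bytes = model_params_billions * 1e9 * bytes_per_param
    bandwidth_bytes = dram_bandwidth_gbps * 1e9  # GB/s -> bytes/s
    return bandwidth_bytes / model_bytes

# A 3B-parameter model quantized to 4 bits (0.5 bytes/param) on a phone
# with ~68 GB/s of DRAM bandwidth (an illustrative figure):
print(round(max_tokens_per_sec(3, 0.5, 68), 1))  # -> 45.3 tokens/s ceiling
```

Under this simple model, doubling bandwidth or halving bytes per parameter directly doubles the token-rate ceiling — which is why both faster memory and quantization matter.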

Memory at the Speed of AI

Several key technologies and industry innovations will be needed to ensure memory solutions keep pace with the escalating demands of GenAI workloads at the edge:

1. Advanced LPDDR (LPDDR5X, LPDDR6)

  • LPDDR5X: Current standard, offering speeds up to ~10.7 Gbps

  • LPDDR6: JEDEC’s upcoming standard promises even higher bandwidth (14.4 Gbps+) and greater power efficiency — critical for sustaining AI performance.
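Per-pin speeds translate into usable bandwidth only through the width of the memory bus. A quick sketch of the arithmetic — the 64-bit bus width is an illustrative assumption, as real channel configurations vary by SoC and JEDEC revision:

```python
# Aggregate DRAM bandwidth = per-pin data rate x bus width / 8 bits-per-byte.
# A 64-bit bus is assumed for illustration; actual phone memory subsystems
# vary in channel count and width.

def bandwidth_gb_s(per_pin_gbps: float, bus_width_bits: int) -> float:
    return per_pin_gbps * bus_width_bits / 8

print(bandwidth_gb_s(10.7, 64))   # LPDDR5X-class rate: 85.6 GB/s
print(bandwidth_gb_s(14.4, 64))   # LPDDR6-class rate: 115.2 GB/s
```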

LPDDR RAM Speed Evolution (Gbps)

Column chart of the speed evolution for LPDDR RAM. Source: Counterpoint Research Memory Service.


2. Processing-In-Memory (PIM) Architecture

PIM fundamentally challenges the current von Neumann architecture by integrating compute functions directly into memory, slashing latency and power use. While standardization and ecosystem support are still evolving, PIM offers strong potential for accelerating select AI tasks.
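A toy model — purely illustrative, not a real PIM programming interface — shows why this matters: for a reduction over a large array, a von Neumann machine must move every operand across the memory bus, while a PIM design moves only one partial result per memory bank.

```python
# Bytes crossing the memory bus for a sum over n 4-byte values
# (toy model; real traffic depends on caches and access patterns).

def bytes_moved_von_neumann(n_values: int, bytes_per_value: int = 4) -> int:
    # Conventional path: every operand travels from DRAM to the CPU/GPU.
    return n_values * bytes_per_value

def bytes_moved_pim(n_banks: int, bytes_per_value: int = 4) -> int:
    # PIM path: each memory bank reduces its slice in place; only one
    # partial result per bank crosses the bus.
    return n_banks * bytes_per_value

print(bytes_moved_von_neumann(1_000_000))  # 4,000,000 bytes over the bus
print(bytes_moved_pim(16))                 # 64 bytes over the bus
```

Even in this crude sketch, bus traffic drops by orders of magnitude — the source of PIM’s latency and power savings.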

Von Neumann Architecture vs. PIM Architecture

Von Neumann Architecture’s CPU/GPU and DRAM compared to PIM Architecture’s CPU/GPU and DRAM. There are many data movements between Von Neumann’s CPU/GPU and DRAM, whereas PIM has fewer data movements. Source: Micron.


3. Wide I/O Interface and Advanced Packaging

Expanding memory-to-SoC data pathways via Wide I/O boosts bandwidth, often using advanced packaging like 3D stacking. These methods also aid thermal management and may allow OEMs to offload DRAM into separate packages to better serve AI-heavy workloads.
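The trade-off can be sketched numerically — the pin counts and rates below are illustrative assumptions, not Wide I/O specification values. Widening the interface lets each pin run slower for comparable aggregate bandwidth, which generally eases signal integrity and per-bit I/O energy.

```python
# Aggregate bandwidth = per-pin rate x interface width / 8 bits-per-byte.
# Widths and rates below are illustrative, not specification values.

def aggregate_bw_gb_s(per_pin_gbps: float, width_bits: int) -> float:
    return per_pin_gbps * width_bits / 8

# Narrow-and-fast vs. wide-and-slow reaching comparable bandwidth:
print(aggregate_bw_gb_s(10.7, 64))   # ~85.6 GB/s: 64-bit bus at LPDDR5X-class speed
print(aggregate_bw_gb_s(1.4, 512))   # ~89.6 GB/s: 512-bit Wide I/O-style bus
```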

Current Architecture vs. Wide I/O Interface

Current architecture’s CPU/GPU and DRAM compared with the Wide I/O architecture’s CPU/GPU and DRAM. Data movement in the current architecture is constrained, whereas Wide I/O enables broad data movement. Source: Micron.


Beyond Hardware: Quantization and Small Language Models

Beyond hardware advancements, techniques like quantization are critical for bringing GenAI to smartphones: by reducing model precision, they cut memory and compute demands while largely preserving accuracy.
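As a minimal sketch of the idea — symmetric int8 quantization with a single scale factor, an illustrative toy scheme rather than the method any particular phone uses — float32 weights shrink 4x:

```python
# Minimal sketch of symmetric int8 weight quantization (illustrative,
# not a production scheme): float32 weights become 8-bit integers plus
# one scale factor, cutting memory 4x.
import numpy as np

def quantize_int8(weights: np.ndarray):
    scale = np.abs(weights).max() / 127.0   # map the largest weight to +/-127
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

w = np.random.default_rng(0).standard_normal(1024).astype(np.float32)
q, scale = quantize_int8(w)
print(w.nbytes, "->", q.nbytes)                    # 4096 -> 1024 bytes
max_err = np.abs(w - dequantize(q, scale)).max()   # bounded by ~scale / 2
```

Production schemes add refinements such as per-channel scales and 4-bit formats, but the memory arithmetic is the same.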

Combined with compact, efficient, small language models (SLMs), these innovations facilitate strong AI performance on-device, accelerating the shift toward intelligent, low-power edge applications.

Next Steps: Industry Action and a Call for Collaboration

Intelligence is only as powerful as the systems behind it, and memory performance is the core enabler of agentic AI on mobile. Meeting AI’s growing memory demands will require unified industry efforts, including:

  • Deep collaboration among SoC designers, memory & storage vendors, OEMs, OS developers, and AI researchers to co-optimize hardware and software for edge AI.

  • Accelerated standardization by bodies like JEDEC for technologies such as LPDDR6 and future packaging interfaces, ensuring interoperability and innovation.

  • Shared vision and investment in next-generation memory, storage and packaging technologies to keep pace with AI’s rapid growth and unlock fully autonomous mobile intelligence.

This is more than a race for faster processors; it's a call to action for the industry to unlock the transformative potential of agentic AI, elevating smartphones into truly intelligent partners that enrich our lives.

The future of mobile is not just smart — it is autonomous. And it starts now.

Download the Whitepaper “Memory At The Edge: Powering GenAI’s Next Frontier” to understand the full context around memory and how it will enable delivery of agentic AI on the smartphone.

Published: Jun 10, 2025

Author: Team Counterpoint

Counterpoint Research is a global industry and market research firm providing market data, intelligence, thought leadership and consulting across the technology ecosystem. We advise a diverse range of global clients spanning the supply chain – from chipmakers, component suppliers, manufacturers and software and application developers to service providers, channel players and investors. Our veteran team of analysts serve these clients through our offices located across the key innovation hubs, manufacturing clusters and commercial centers globally. Our analysts consistently engage with C-suite through to strategy, market intelligence, supply chain, R&D, product management, marketing, sales and others across the organization. Counterpoint’s key coverage areas: AI, Automotive, Cloud, Connectivity, Consumer Electronics, Displays, eSIM, IoT, Location Platforms, Macroeconomics, Manufacturing, Networks & Infra, Semiconductors, Smartphones and Wearables.