Why the AI bull market’s obsession shifted away from Nvidia to memory chip makers
Chipmaker stocks have exploded over the past month, with Micron up 80%, SanDisk up 52% and Intel up 85%, to name just a few companies participating in the rally.

Behind the surge is an evolving AI systems architecture known as "orchestration," in which workloads are distributed across multiple processing channels rather than concentrated in bigger, more centralized blocks. Orchestration requires a greater number of traditional central processing units (CPUs) relative to the beefier graphics processing units (GPUs) that drove Nvidia's rise during the first phase of the AI buildout.

While GPUs will remain essential for core AI tasks like model training and query response, Wall Street thinks the orchestration-driven shift in the chip-demand mix is set to continue as AI software becomes "agentic" – that is, better at handling more generalized instructions.

[Chart: Micron vs. Nvidia, 1 year]

"We believe agentic AI will increase the CPU-to-GPU mix in AI systems by adding more orchestration, memory, and tool-use work," Morgan Stanley analyst Shawn Kim and colleagues wrote in a Monday note to investors. "This should not reduce GPU demand, but it does increase overall system complexity and shifts incremental infrastructure spend toward CPUs, networking, and memory."

The new buzzword

Tech companies are saying similar things about orchestration, emphasizing coordination and adaptability within their infrastructure – rather than chip architecture itself – as the way to boost computing capacity for AI.

"No single chip architecture can efficiently serve every workload," Meta said in an April statement announcing its rental of "tens of millions" of Graviton CPUs from Amazon's cloud infrastructure subsidiary. "As Meta advances its work with agentic AI, compute requirements are evolving to demand more CPU."

Chipmaker AMD also stressed CPUs in the context of orchestration as part of a deal announced with Meta in February.
The $60 billion deal stipulated that Meta would buy six gigawatts' worth of chips over five years while allowing it to purchase up to 10% of AMD, according to news agency Reuters.

"As AI infrastructure grows in scale and complexity, CPUs are a strategic pillar of the AI compute stack, enabling efficiency, scalability and orchestration alongside GPUs," AMD said in a statement.

The cybersecurity link

Orchestration is already proving itself as a cost-effective way to boost AI capabilities. The launch of Anthropic's Mythos sent shockwaves through the cybersecurity world last month, prompting the company to limit access to it, but multiple research organizations say they have been able to reproduce its results by orchestrating less-advanced, publicly available models.

"We used GPT-5.4 and Claude Opus 4.6 in opencode, together with a standardized chunked security-review workflow, and tried to reproduce Anthropic's patched public examples outside Anthropic's internal stack," researchers at Vidoc Security Lab said, describing their results as "more useful."

"The takeaway is not whether Mythos is better or more powerful. It is that public models can already achieve much the same results," they said.

Cybersecurity firm Aisle said it did the same, using smaller, cheaper, coordinated models to isolate the same security bugs. "The small models already provide sufficient uplift that, wrapped in expert orchestration … produce results that the ecosystem takes seriously," the firm wrote.

One industry consultant told CNBC that equating AI computing power with GPUs amounted to a "misunderstanding" in the market.

"The misunderstanding in the marketplace is that if you're doing AI, it has to be running on GPUs. That's not the case. I don't know why people started to believe that – maybe Nvidia and marketing and things like that," former chief cloud officer for Deloitte David Linthicum said.
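In broad strokes, the "chunked security-review workflow" the researchers describe means splitting a codebase into pieces, fanning each piece out to a cheap reviewer model, and merging the findings. The sketch below is a hypothetical illustration of that pattern, not Vidoc's or Aisle's actual pipeline: the reviewer is a stub that flags a few risky patterns, standing in for what would in practice be a call to a hosted model.

```python
# Illustrative sketch of a chunked security-review workflow (assumed
# structure, not any firm's real pipeline): chunk the source, review
# each chunk with a cheap model, then aggregate deduplicated findings.

def chunk_source(source: str, max_lines: int = 3) -> list[str]:
    """Split source code into fixed-size chunks of lines."""
    lines = source.splitlines()
    return ["\n".join(lines[i:i + max_lines])
            for i in range(0, len(lines), max_lines)]

def review_chunk(chunk: str) -> list[str]:
    """Stand-in for a model call: flag obviously risky patterns."""
    risky = ("eval(", "os.system(", "pickle.loads(")
    return [f"possible issue: {pat!r}" for pat in risky if pat in chunk]

def orchestrate_review(source: str) -> list[str]:
    """Fan chunks out to reviewers and merge deduplicated findings."""
    findings: list[str] = []
    for chunk in chunk_source(source):
        for finding in review_chunk(chunk):
            if finding not in findings:
                findings.append(finding)
    return findings

sample = "import os\nos.system('ls')\nx = eval(user_input)\nprint(x)"
print(orchestrate_review(sample))
```

The point of the pattern is economic rather than architectural: each chunk review is a small, parallelizable task that can run on a cheaper model (or CPU-served model), with the orchestration layer doing the coordination work.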
"When I'm training architects, I always tell them to use the minimum viable technology, and that's going to be CPUs as much as possible."

Other beneficiaries

Other parts of the data center supply chain are benefiting from the pivot to orchestration as well, specifically the interstitial segments that connect different processing channels. These are sectors like electronic design automation, baseboard management control, and substrates, along with memory systems like DRAM and NAND.

Some downstream standouts mentioned in the May 11 Morgan Stanley note on "agentic" AI include KLA Corp, Cadence Design Systems, and Taiwan-based Gold Circuit Electronics, along with the big memory makers – Samsung, SK Hynix, Micron, SanDisk, and Kioxia.