Micron Technology has surged 140% in 2025 despite exiting its AI server memory chip business in China, driven by explosive demand for high-bandwidth memory (HBM) chips used in artificial intelligence applications. The memory chipmaker trades at just 24 times trailing earnings—significantly cheaper than the broader tech market—while analysts project its earnings could double in fiscal 2026, potentially setting up the stock for another 170% gain.
What you should know: Micron delivered exceptional financial performance in fiscal 2025, with revenue jumping 49% to $37.4 billion and adjusted earnings per share soaring from $1.30 to $8.29.
- The company’s cloud memory business unit, which sells HBM chips, saw revenue grow 3.5-fold to $13.5 billion as AI server demand exploded.
- Operating margin expanded nearly fourfold as tight supply and healthy demand for HBM chips boosted pricing power.
- Despite the strong rally, Micron trades at an attractive valuation of 24 times trailing earnings compared to the Nasdaq-100’s 33 times multiple.
The China situation: Micron plans to stop selling AI server memory chips to Chinese customers, redirecting that capacity to other markets where demand remains robust.
- China represented just $2.6 billion in revenue (7% of total sales) in fiscal 2025, primarily from data center business that Beijing has restricted since 2023.
- The company will continue selling memory chips for automotive, smartphone, and PC applications in China, as well as data center chips to Chinese companies, such as Lenovo, that operate outside the country.
- Management expects “to conclude agreements to sell out the remainder of our total HBM calendar 2026 supply in the coming months,” indicating strong demand from other customers.
Why HBM demand is exploding: High-bandwidth memory has become essential for AI workloads, creating a supply-constrained market with multiple growth drivers.
- Goldman Sachs predicts 23% growth in GPU-related HBM demand next year, plus an 82% surge from custom AI processors.
- Micron now supplies HBM to six customers, likely including major AI chipmakers such as Nvidia, Broadcom, and AMD, which are receiving massive data center orders.
- Industry analysts project $4 trillion in AI infrastructure spending by the end of the decade, underpinning long-term demand for memory chips.
In plain English: High-bandwidth memory (HBM) is like super-fast data storage that sits right next to AI processors, allowing them to quickly access the massive amounts of information needed for artificial intelligence tasks. Think of it as the difference between having your reference books on your desk versus having to walk to the library every time you need to look something up.
The earnings growth story: Wall Street analysts expect Micron’s bottom line to double in fiscal 2026, potentially driving significant stock price appreciation.
- If the company hits projected earnings of $16.68 per share and trades at the Nasdaq-100’s current 33-times multiple, the stock price could reach roughly $550 (see the arithmetic sketch after this list).
- That scenario would represent 170% upside from current levels around $219, rewarding investors who buy before the next leg higher.
- The combination of explosive earnings growth and an attractive valuation makes Micron an outlier among AI stocks, which are typically either unprofitable or expensive.
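For readers who want to check the numbers, here is a minimal sketch of the arithmetic using the figures cited above (projected fiscal 2026 EPS of $16.68 and the Nasdaq-100’s 33-times earnings multiple); the implied upside percentage depends on the share price assumed at the time of purchase:

$$\text{Target price} \approx \text{EPS} \times \text{P/E multiple} = \$16.68 \times 33 \approx \$550$$

$$\text{Implied upside} = \frac{\text{target price}}{\text{current share price}} - 1$$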