SK Group Warns Memory Shortage Could Last to 2030, Raising Stakes for AI Growth

SK Group chairman Chey Tae-won warned at NVIDIA’s GTC that global shortages of memory chips—especially HBM used in AI accelerators—could persist until 2030. He cited systemic production bottlenecks and rising AI demand that will likely keep DRAM, NAND and HBM prices elevated and prompt further investment and strategic moves by chipmakers.


Key Takeaways

  • SK Group chairman Chey Tae-won told NVIDIA GTC attendees the memory chip shortage could last until 2030 due to systemic production bottlenecks.
  • AI demand has pushed shortage rates for some AI‑focused storage chips above 30%, with HBM singled out as a key chokepoint.
  • SK Hynix anticipates needing four to five years to meaningfully expand wafer capacity and is considering a U.S. ADR listing to broaden its investor base.
  • Sustained higher prices for DRAM, NAND and HBM would raise cloud and data‑centre costs and could force industry and policy adjustments.
  • NVIDIA’s bullish chip demand forecasts and new product announcements helped lift Korean chip stocks, underscoring tight AI‑compute dynamics.

Editor's Desk

Strategic Analysis

Chey’s 2030 timeline should be read less as a precise forecast and more as a strategic signal: memory shortages are not a short, cyclical hiccup but a structural issue tied to the accelerated pace of AI demand and the long lead times of semiconductor capital projects. That shifts the policy and corporate calculus. Firms will pour capital into fabs, advanced packaging and supply‑chain partnerships, but returns will be slow and uneven, favouring incumbents with capital and geopolitical reach. For governments, the episode validates industrial policy interventions to secure advanced memory supply and to incentivise alternative technology paths that reduce pressure on scarce HBM resources. For buyers of compute, it creates urgency to optimise memory use, diversify suppliers and consider architectural workarounds that mitigate exposure to a single scarce component.


Speaking at NVIDIA’s GTC conference in San Jose on March 17, SK Group chairman Chey Tae-won delivered a stark prognosis for the semiconductor industry: the global shortage of memory chips—particularly the high-bandwidth memory (HBM) used by artificial-intelligence accelerators—could persist until 2030. He attributed the problem to systemic production bottlenecks and said market demand from AI workloads has pushed shortages above 30% for certain AI-focused storage chips. Chey warned that DRAM, NAND and HBM prices are likely to rise and remain elevated for an extended period as wafer capacity and upstream supply chains struggle to catch up.

The comments echoed remarks Chey made in Washington last month and reinforced a familiar theme for memory makers: surging AI demand is colliding with long lead times for new fab capacity. SK Hynix, one of the world’s largest memory manufacturers and a key supplier of HBM to NVIDIA, told investors it will need at least four to five years to expand wafer output sufficiently to relieve the squeeze. The company is reportedly weighing a U.S. ADR listing to broaden its investor base and may announce measures to stabilise DRAM prices in the near term.

NVIDIA’s presentations at GTC amplified the dynamic. CEO Jensen Huang forecast enormous demand for next‑generation AI chips, a message that helped lift South Korean semiconductor stocks; Samsung and SK Hynix shares rallied after Huang revealed a new Grok3 LPU chip produced by Samsung and reiterated bullish long‑term AI demand projections. That bullish outlook for AI compute capacity helps explain why HBM—a small, specialised but indispensable component for accelerators—has become a chokepoint: HBM stacks are complex, require advanced packaging and rely on constrained wafer, foundry and test resources.

The practical implications of a prolonged memory squeeze are wide. Higher DRAM/NAND prices would raise costs for cloud providers and data centres, translating into more expensive training and serving of large AI models. Industries from smartphones to enterprise storage may face delayed product cycles or higher prices as memory is diverted to high‑value AI customers. Politically, the strain underscores why governments and companies are racing to secure domestic or allied supply chains for advanced semiconductors and the equipment and chemicals that feed them.

Several structural factors explain why relief would be slow. Building new wafer fabrication and advanced packaging capacity takes years and requires massive capital investment, specialised equipment (much of it produced by a handful of firms), and a skilled workforce. The HBM supply chain is also congested by capacity constraints in back‑end assembly, thermal interface materials and through‑silicon via processes that cannot be scaled overnight. Even with sizeable capex commitments, companies face long lead times before new capacity delivers the high‑yield memory that AI accelerators require.

For investors and policy makers, Chey’s forecast is both a warning and a signal. For memory vendors, the prospect of sustained high prices justifies rapid investment and strategic partnerships; for customers, it suggests the need to diversify procurement and consider architectural changes—such as more efficient memory usage or alternative memory hierarchies—to reduce dependence on scarce HBM. For governments, the shortage strengthens the case for subsidies, export controls and collaboration to shore up critical nodes in the semiconductor supply chain.

In the near term, expect price volatility and bidding wars for production slots. In the medium term, market shares could shift toward firms that successfully scale HBM capacity or pursue vertical integration. And in the longer run, persistent shortages would accelerate strategic competition over chip production capacity and could reshape how AI systems are architected to be more memory‑efficient.
