Technology

SK hynix Files $14B US IPO to End the Memory Chip Crisis Choking AI

The world's HBM shortage has a name: RAMmageddon. South Korea's memory giant just filed for Wall Street's biggest tech listing in five years to fix it.

Close-up of semiconductor memory chips on a circuit board
SK hynix plans a $14 billion US IPO to address the AI memory chip shortage.
Apr 3, 2026 · 9 min read
By Alex Mercer · 2026-03-30

SK hynix just took the most aggressive step yet to fix the memory shortage choking AI infrastructure: a confidential SEC filing for a US stock listing that could raise between $10 billion and $14 billion. AI models need memory, lots of it, and the South Korean chipmaker that supplies the high-bandwidth memory stacks Nvidia glues to its GPUs sees no other way to close the gap.

KEY TAKEAWAYS

01 SK hynix filed Form F-1 for a potential $10-14 billion US listing in H2 2026, which would be the biggest tech IPO in the US since 2021
02 The company's entire 2026 HBM and DRAM production capacity is sold out, with memory prices surging 50-55% in Q1 2026 alone
03 Proceeds will fund $7.9 billion in ASML EUV lithography tools and expansion at facilities in South Korea and Indiana
04 CEO Kwak Noh-Jung said SK hynix targets $75 billion in net cash to support long-term AI memory investments

SK hynix announced the Form F-1 submission in a regulatory disclosure on March 25, targeting the second half of 2026 for completion. If completed, it would be the largest tech IPO in the US since 2021 and potentially the biggest public offering of any kind in five years. The company told regulators it would finalise timing after assessing market conditions and demand.

RAMmageddon has a price tag

Average DRAM prices rose 50% to 55% in the first quarter of 2026 compared to Q4 2025, according to TrendForce. The memory shortage has an unofficial name in the industry: RAMmageddon. Tom Hsu, an analyst at the Taipei-based research firm, told CNBC the speed of the increase was "unprecedented. We haven't seen this kind of velocity in pricing before, even during past undersupply cycles."

SK hynix isn't just short on capacity. The company said in October that its HBM, DRAM, and NAND production for 2026 is "essentially sold out," and it's not alone. Samsung Electronics and Micron face the same constraint, and demand keeps climbing. In December, both SK hynix and Samsung raised HBM3E prices by nearly 20% for 2026 deliveries.

Dell COO Jeffrey Clarke told analysts in January the shortage would affect retail device prices despite efforts to rebalance configurations. "The cost increases are real," Clarke said. "I don't see how this will not make its way into the customer base. We'll do everything we can to mitigate that, but there's only so much we can absorb."

Nvidia's Blackwell GPUs use eight HBM3E blocks surrounding each processor die, each block containing multiple stacked DRAM layers. Even the company with the deepest pockets and tightest supplier relationships can't escape the reality: there isn't enough memory to meet demand, and that constraint is now structural.

Why memory is the new bottleneck

HBM chips are stacked vertically and bonded directly to the GPU or AI accelerator using advanced packaging techniques. This design delivers the bandwidth AI models need to shuttle weights and activations between memory and compute cores at speeds traditional DDR memory can't match.

Each Blackwell GPU die is surrounded by memory blocks feeding data through thousands of simultaneous connections. If you can't feed the compute fast enough, performance collapses, no matter how powerful the silicon. Nvidia's architecture makes this dependency obvious.
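To see why that dependency matters, here is a back-of-envelope comparison of aggregate bandwidth. The per-stack and per-channel figures below are representative round numbers assumed for illustration, not figures from this article:

```python
# Back-of-envelope: aggregate bandwidth of stacked HBM vs. traditional DDR.
# Per-stack and per-channel figures are assumed round numbers for illustration.

HBM3E_STACKS = 8               # stacks per Blackwell-class GPU (per the article)
HBM3E_PER_STACK_TBPS = 1.0     # ~1 TB/s per HBM3E stack (assumed)

DDR5_CHANNELS = 8              # channels on a typical server CPU (assumed)
DDR5_PER_CHANNEL_TBPS = 0.064  # DDR5-6400: ~64 GB/s per channel (assumed)

hbm_total = HBM3E_STACKS * HBM3E_PER_STACK_TBPS
ddr_total = DDR5_CHANNELS * DDR5_PER_CHANNEL_TBPS

print(f"HBM3E aggregate: ~{hbm_total:.1f} TB/s")
print(f"DDR5 aggregate:  ~{ddr_total:.3f} TB/s")
print(f"Ratio: ~{hbm_total / ddr_total:.0f}x")
```

Even with generous assumptions for the DDR side, the stacked-memory design delivers an order of magnitude more bandwidth, which is why AI accelerators cannot simply substitute commodity DRAM.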

ASML's extreme ultraviolet (EUV) lithography tools cost hundreds of millions of dollars each and take years to deliver. These aren't commodities you can order off the shelf. Building HBM is harder than building standard DRAM, and the supply chain reflects that reality.

SK hynix just ordered 11.95 trillion won ($7.9 billion) worth of ASML's EUV scanners for delivery by 2027. That order represents the largest single purchase publicly disclosed by an ASML customer. Those tools will go into expanding HBM production at the company's facilities in South Korea and Indiana. No other memory maker has made a commitment this large this fast.

HBM is projected to account for 23% of total DRAM wafer output in 2026, up from 19% in 2025, according to TrendForce data. Manufacturers pulled forward capacity to chase higher margins, and the result is a crunch in both markets. Every wafer dedicated to HBM is a wafer not making commodity DRAM for PCs, smartphones, or servers.
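The zero-sum nature of that trade-off is easy to quantify. Using the article's 19% and 23% shares and a hypothetical round number for industry-wide wafer starts (not a reported figure):

```python
# Illustrative wafer math: every point of output shifted to HBM comes out
# of commodity DRAM. Total monthly wafer starts below are a hypothetical
# round number, not a reported industry figure.

TOTAL_DRAM_WAFERS = 1_000_000  # hypothetical industry-wide monthly starts

hbm_2025 = 0.19 * TOTAL_DRAM_WAFERS  # 19% share in 2025 (per TrendForce)
hbm_2026 = 0.23 * TOTAL_DRAM_WAFERS  # 23% share in 2026 (per TrendForce)

commodity_2025 = TOTAL_DRAM_WAFERS - hbm_2025
commodity_2026 = TOTAL_DRAM_WAFERS - hbm_2026

print(f"HBM wafers gained:      {hbm_2026 - hbm_2025:,.0f}")
print(f"Commodity wafers lost:  {commodity_2025 - commodity_2026:,.0f}")
```

At fixed total capacity, every wafer HBM gains is a wafer commodity DRAM loses one-for-one, which is why the crunch hits both markets at once.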

Where the money goes

SK hynix CEO Kwak Noh-Jung told shareholders on March 25 the company is targeting more than 100 trillion won ($75 billion) in net cash to support long-term strategic investments. That's a huge jump from the 12.7 trillion won ($9.5 billion) the company had at the end of 2025. "Financial capacity will be key to sustaining growth in the AI era," Kwak said at the annual general meeting.

Kwak added that SK hynix "succeeded in developing HBM4 technology and established a proactive mass production base. We continue to lead the industry in next-generation HBM technology, but leadership requires infrastructure." Infrastructure costs billions, and the IPO is the mechanism to fund it.

SK hynix is building a massive semiconductor cluster in Yongin, South Korea, with planned investments of around $400 billion by 2050. IPO proceeds will fund capacity expansion across multiple sites, including the company's M15X fab in Cheongju, where it began equipping the second cleanroom two months ahead of schedule.

SK hynix will use American depositary receipts (ADRs) rather than directly listing shares. SK Square, the company's largest shareholder with a 20.07% stake, is required to hold at least 20% under South Korean holding company rules. Analysts estimate issuing roughly 2% to 3% in new shares could raise $10 billion to $14 billion without diluting SK Square below the threshold.
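The analysts' estimate implies a rough valuation range, since selling a given fraction of the company for a given sum prices the whole company at raise divided by fraction. A minimal sketch using the article's ranges:

```python
# Rough implied-valuation math behind the analysts' estimate: if selling
# a fraction of the company raises a given sum, the whole company is
# implicitly valued at raise / fraction. Inputs are the article's ranges.

def implied_valuation(raise_usd_bn: float, fraction: float) -> float:
    """Valuation (in $B) implied by raising raise_usd_bn for a stake of `fraction`."""
    return raise_usd_bn / fraction

low = implied_valuation(10, 0.03)   # $10B raised for 3% of shares
high = implied_valuation(14, 0.02)  # $14B raised for 2% of shares

print(f"Implied valuation range: ~${low:.0f}B to ~${high:.0f}B")
```

The wide spread is a reminder that these are analyst estimates, not disclosed terms; the confidential filing leaves the actual share count and pricing open.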

Samsung might be next

Artisan Partners, a major Samsung shareholder, said publicly after SK hynix's filing that Samsung should consider a similar US listing. The firm told Bloomberg a US listing "could help Samsung boost its valuation, too, as well as give US retail investors a chance to buy its stock." Samsung trades at a discount to global peers, partly due to limited access for US investors.

Samsung following suit would flood the industry with new capital for memory manufacturing over the next two to three years. Memory fabs take at least two years to build, assuming no delays with equipment delivery or regulatory approvals. New capacity wouldn't come online until 2028 at the earliest.

Micron told investors memory tightness could persist beyond 2026, driven largely by AI demand. Stacy Rasgon, an analyst at Bernstein, pointed to memory as one of the tight parts of the chip supply chain in an interview with Tom's Hardware in January. "Nobody's scaling up," Rasgon said. "The industry learned its lesson from past oversupply cycles. You're not seeing capacity expansion ahead of demand anymore."

AI infrastructure needs more than GPUs

Memory is the pipe that connects compute to actual performance, and the pipe is too small. Nvidia has all the compute in the world, but if you can't feed it data fast enough, performance tanks. That's where the industry is today. Most people still underestimate how structural this constraint has become.

Nvidia CEO Jensen Huang said in early 2025 that advanced packaging capacity had quadrupled in under two years but was still a bottleneck. TSMC, which provides Nvidia's CoWoS (Chip-on-Wafer-on-Substrate) packaging, is scaling from around 75,000 wafers per month in 2025 to an estimated 120,000 to 130,000 wafers per month by the end of 2026. That's 60% to 73% growth in a single year.

TrendForce projects capacity will remain tight through most of 2026 before new fabs start coming online in 2027 and 2028. That's massive growth, but it's still playing catch-up to demand. A decision made today translates to capacity in 2028. SK hynix's IPO is a bet that this demand is structural, not cyclical. That bet looks increasingly solid.

SK hynix is saying AI isn't a bubble, memory will remain tight for years, and builders need billions of dollars now to construct the capacity the industry will need in 2028 and beyond. Wall Street woke up to the fact that the real money in AI isn't in models or apps but in the unglamorous infrastructure that makes them possible. Whether Samsung and Micron follow SK hynix to US markets will tell you how serious they think the shortage is. At least one of them will file within 18 months.

TLDR

SK hynix confidentially filed for a US stock listing targeting $10-14 billion, aiming to fund massive capacity expansion for high-bandwidth memory (HBM) chips used in AI processors. Memory shortages have driven sharp price spikes (DRAM jumped 50-55% in Q1 2026 alone). With 2026 production already sold out and competitors Samsung and Micron facing the same crunch, the IPO could fund the infrastructure AI actually needs.

FREQUENTLY ASKED QUESTIONS

What is RAMmageddon?
RAMmageddon is the informal industry term for the current global memory chip shortage. It refers to the supply crunch in DRAM and HBM (high-bandwidth memory) caused by surging AI demand. Memory prices jumped 50-55% in Q1 2026 alone, and major manufacturers like SK hynix, Samsung, and Micron have sold out their 2026 production capacity.
Why is HBM different from regular RAM?
HBM (high-bandwidth memory) is stacked vertically and bonded directly to GPUs or AI processors using advanced packaging, delivering far higher bandwidth than traditional DDR memory. It's essential for AI accelerators because it can shuttle data between memory and compute cores fast enough to keep up with parallel processing workloads. Building HBM requires expensive EUV lithography tools and complex manufacturing processes, which is why it's in short supply.
How big is SK hynix's planned US IPO?
SK hynix confidentially filed Form F-1 with the SEC for a potential US listing that could raise between $10 billion and $14 billion in the second half of 2026. If completed, it would be the largest tech IPO in the US since 2021 and potentially the biggest public offering of any kind in five years.
Will the IPO actually fix the memory shortage?
Not immediately. Memory fabs take at least two years to build, so even with the IPO proceeds invested today, new capacity wouldn't come online until 2028 at the earliest. However, if the IPO succeeds and Samsung follows with its own US listing, the industry could see new capital for long-term capacity expansion. The shortage will likely persist through 2026 and into 2027 before new supply starts to ease the crunch.
Editor

The Bushletter editorial team. Independent business journalism covering markets, technology, policy, and culture.
