
The memory chip industry's profit margin surpasses that of the semiconductor foundry industry

Sunday, Dec 28, 2025

 With the rapid rise in demand for artificial intelligence, memory chip prices have surged, leading to a shift in the profit structure of the semiconductor industry. According to the *Korea Economic Daily*, Samsung Electronics' memory division and SK Hynix are expected to surpass TSMC's gross margin in the fourth quarter of 2025. This would be the first time since the fourth quarter of 2018 that the memory chip industry's profit margin has exceeded that of the semiconductor foundry industry.

 
The report indicates that Samsung Electronics' and SK Hynix's gross margins are projected to reach 63% to 67%, higher than TSMC's projected 60%. Micron, the world's third-largest memory chip manufacturer, already achieved a 56% gross margin in the first quarter of fiscal 2026 (September to November 2025) and expects it to rise to 67% in the second quarter (December 2025 to February 2026), suggesting that Micron, too, could surpass TSMC's profitability in the first quarter of calendar 2026.
 
The rapid rise in memory chip prices is a major driver of the memory industry's profit growth. The three major memory chip manufacturers have currently allocated roughly 18% to 28% of their DRAM production capacity to high-bandwidth memory (HBM). Because each HBM package is built by stacking 8 to 16 DRAM dies, every unit of HBM output consumes many times the dies of a single general-purpose DRAM chip, significantly compressing the supply of general-purpose DRAM and pushing its price up more than 30% in a single quarter.
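The supply squeeze described above can be sketched with a back-of-the-envelope calculation. All of the numbers below (total die output, the 28% allocation, 8-die stacks) are illustrative assumptions, not figures from the report, except the 8-16 die stacking range:

```python
# Hypothetical sketch of the HBM supply squeeze.
# Assumption: a producer's fabs turn out DIES_PER_QUARTER DRAM dies;
# each HBM stack consumes `dies_per_stack` dies (8-16 per the article),
# while the remaining dies become single-die general-purpose DRAM chips.

DIES_PER_QUARTER = 1_000_000  # hypothetical total die output

def output_mix(hbm_share: float, dies_per_stack: int):
    """Return (hbm_stacks, commodity_chips) for a given capacity split."""
    hbm_dies = DIES_PER_QUARTER * hbm_share
    hbm_stacks = hbm_dies / dies_per_stack
    commodity_chips = DIES_PER_QUARTER - hbm_dies
    return hbm_stacks, commodity_chips

stacks, chips = output_mix(hbm_share=0.28, dies_per_stack=8)
# Diverting 28% of dies yields only 35,000 HBM stacks, while the
# commodity chip supply drops from 1,000,000 to 720,000 units.
```

The asymmetry is the point: a modest capacity shift toward HBM removes a disproportionately large share of finished general-purpose DRAM chips from the market, which is consistent with the sharp single-quarter price increase the report describes.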
 
The report points out that as the artificial intelligence industry shifts from "training" to "inference," fast data storage and retrieval become crucial. Inference applies the knowledge gained during training to problem-solving, which requires memory such as HBM to store data and continuously feed it to GPUs. This growing demand for memory chips is what is driving memory gross margins past those of wafer foundries.
 
Furthermore, although general-purpose DRAM lags behind HBM in performance, in the early stages of AI's shift to inference, workloads are typically handled by general-purpose DRAM (such as GDDR7 and LPDDR5X), while HBM is reserved for more intensive inference tasks. Nvidia's use of GDDR7 in its inference-focused AI accelerators is a typical example.
 
Meanwhile, memory chip manufacturers plan to sustain the memory-centric era by developing high-performance products tailored to artificial intelligence. One example is Processing-In-Memory (PIM), which enables memory to handle some of the computational workloads traditionally performed by GPUs. The report adds that technologies such as Vertical Channel Transistor (VCT) DRAM and 3D DRAM are also expected to enter the market, increasing data density by storing more information in a smaller area.
