The company announced the development of HBM3E just seven months ago.
HBM (High Bandwidth Memory) is a high-value, high-performance memory that vertically interconnects multiple DRAM chips and dramatically increases data processing speed in comparison to conventional DRAM products.
HBM3E is the extended version of HBM3 and the fifth generation of HBM, following HBM, HBM2, HBM2E and HBM3.
SK hynix said that it expects successful volume production of HBM3E to cement its leadership in the AI memory space and to help it meet the demands of global big tech companies requiring higher-performance AI semiconductors. The company expects HBM3E to play a leading role in this fast-developing market.
HBM3E can process up to 1.18TB of data per second, equivalent to processing more than 230 full-HD movies (5GB each) in a single second.
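As a rough check of that comparison, the short sketch below converts the stated bandwidth into movies per second. It assumes decimal units (1TB = 1,000GB) and the 5GB-per-movie figure quoted above; it is an illustration, not a specification from SK hynix.

    # Sanity check: how many 5GB movies fit into 1.18TB per second?
    bandwidth_tb_per_s = 1.18      # stated HBM3E bandwidth (terabytes per second)
    movie_size_gb = 5              # assumed size of one full-HD movie

    movies_per_second = bandwidth_tb_per_s * 1000 / movie_size_gb
    print(f"{movies_per_second:.0f} movies per second")  # prints 236

The result, roughly 236 movies per second, is consistent with the "more than 230" figure in the announcement.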
As AI memory operates at extremely high speed, controlling heat is another key requirement. SK hynix's HBM3E delivers a 10% improvement in heat-dissipation performance over the previous generation, thanks to the application of the advanced MR-MUF process.
MR-MUF (Mass Reflow Molded Underfill) is a process in which semiconductor chips are stacked, a liquid protective material is injected between them to protect the circuitry, and the material is then hardened. The process has proved more efficient and effective for heat dissipation than laying film-type material between each chip in the stack.
SK hynix's advanced MR-MUF technology is seen as critical to securing stable mass production on the supply side of the HBM ecosystem, as it reduces the pressure on the stacked chips and improves warpage control.
Sungsoo Ryu, Head of HBM Business at SK hynix, said that mass production of HBM3E has completed the company's lineup of AI memory products.