SK hynix begins volume production of 12-Layer HBM3E


SK hynix has begun mass production of the world's first 12-layer HBM3E product with 36GB of capacity, the largest of any HBM to date.


The company plans to supply the mass-produced product to customers within the year, just six months after first delivering the 8-layer HBM3E product.

SK hynix is currently the only company in the world that has developed and supplied the entire HBM lineup, from the first generation (HBM1) to the fifth generation (HBM3E), since releasing the world's first HBM in 2013. It said it plans to continue its leadership in the AI memory market by mass-producing the 12-layer HBM3E to address the growing needs of AI companies.

The company said that the 12-layer HBM3E product meets the world's highest standards in all areas that are essential for AI memory including speed, capacity and stability. SK hynix has increased the speed of memory operations to 9.6 Gbps, the highest memory speed currently available.

For example, if Llama 3 70B, a large language model (LLM), is driven by a single GPU equipped with four HBM3E products, the GPU can read all 70 billion parameters 35 times within a second.
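As a rough sanity check on that figure, the sketch below reproduces the arithmetic under a few assumptions not spelled out in the article: the quoted 9.6 Gbps is a per-pin data rate, each HBM3E stack uses a 1024-bit interface, and the model weights are stored as 2-byte (FP16) values.

```python
# Back-of-the-envelope check of the "35 full-model reads per second" figure.
# Assumptions (not stated in the article): 1024-bit interface per HBM3E
# stack, the 9.6 Gbps figure is per pin, and FP16 (2-byte) parameters.

PIN_SPEED_GBPS = 9.6          # per-pin data rate quoted in the article
INTERFACE_WIDTH_BITS = 1024   # assumed bus width of one HBM3E stack
STACKS = 4                    # HBM3E stacks on the GPU
PARAMS = 70e9                 # Llama 3 70B parameter count
BYTES_PER_PARAM = 2           # assumed FP16 weights

# Bandwidth of one stack in GB/s, then the total for four stacks.
stack_bw_gbs = PIN_SPEED_GBPS * INTERFACE_WIDTH_BITS / 8   # ~1228.8 GB/s
total_bw_gbs = stack_bw_gbs * STACKS                       # ~4915.2 GB/s

# How many times the full parameter set can be streamed per second.
model_size_gb = PARAMS * BYTES_PER_PARAM / 1e9             # 140 GB
reads_per_second = total_bw_gbs / model_size_gb

print(f"Per-stack bandwidth: {stack_bw_gbs:.1f} GB/s")
print(f"Total bandwidth:     {total_bw_gbs:.1f} GB/s")
print(f"Full-model reads/s:  {reads_per_second:.1f}")      # ~35
```

Under these assumptions the numbers line up: roughly 4.9 TB/s of aggregate bandwidth divided by a 140GB weight set gives about 35 passes per second.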

SK hynix has increased the capacity by 50% by stacking 12 layers of 3GB DRAM chips at the same thickness as the previous eight-layer product. To achieve this, the company made each DRAM chip 40% thinner than before and stacked them vertically using TSV technology.
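The 50% figure follows directly from the layer count, assuming the previous 8-layer HBM3E product also stacked 3GB DRAM dies (24GB per stack), which the article implies but does not state outright.

```python
# Quick check of the 50% capacity increase, assuming 3GB dies in both
# the 8-layer and the new 12-layer HBM3E stacks.

DIE_CAPACITY_GB = 3

capacity_12_layer = 12 * DIE_CAPACITY_GB   # 36 GB
capacity_8_layer = 8 * DIE_CAPACITY_GB     # 24 GB

increase = (capacity_12_layer - capacity_8_layer) / capacity_8_layer
print(f"{capacity_12_layer} GB vs {capacity_8_layer} GB -> +{increase:.0%}")  # +50%
```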

The company also solved the structural issues that arise from stacking thinner chips higher by applying its core technology, the Advanced MR-MUF process. This delivers 10% higher heat dissipation performance than the previous generation and secures the stability and reliability of the product through enhanced warpage control.