SK Hynix has shipped samples of its new 12-layer HBM4 memory to major customers, ahead of all competitors. The company attributes the early delivery of these high-performance DRAM products for AI to its combined strengths in chip design and advanced manufacturing. Customer certification of the samples is already underway, and SK Hynix aims to complete preparations for mass production in the second half of the year, a step it expects will further strengthen its position in the AI memory market.
The 12-layer HBM4 samples offer the highest capacity and bandwidth of any AI memory to date. Each stack processes more than 2 terabytes of data per second, a first for the industry, equivalent to handling the data of over 400 full-HD movies every second. That is an improvement of more than 60 percent over the previous-generation HBM3E.
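The 400-movies comparison can be sanity-checked with quick arithmetic. A minimal sketch, assuming roughly 5 GB per full-HD movie (a common rule of thumb; the per-movie size is an assumption, not stated in the announcement):

```python
# Sanity check of the quoted bandwidth comparison.
# Assumption: one full-HD movie is roughly 5 GB (not from the article).

bandwidth_tb_per_s = 2            # HBM4 stack bandwidth: more than 2 TB/s
movie_size_gb = 5                 # assumed size of one full-HD movie

bandwidth_gb_per_s = bandwidth_tb_per_s * 1024   # TB -> GB (binary units)
movies_per_second = bandwidth_gb_per_s / movie_size_gb

print(f"~{movies_per_second:.0f} full-HD movies per second")  # ~410
```

Under these assumptions the figure lands just above 400, consistent with the "more than 400" claim.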
To reach 36 GB, the highest capacity yet for a 12-layer HBM product, the company applied its Advanced MR-MUF packaging process, which it has already proven in earlier generations. The process controls warpage of the stacked dies and improves heat dissipation, enhancing both performance and product reliability.
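The 36 GB total follows directly from the stack height if one assumes 24-gigabit (3 GB) DRAM dies, a common configuration for 12-layer HBM. The per-die density is an assumption for illustration; the article states only the total capacity:

```python
# Capacity of a 12-layer HBM stack, assuming 24 Gb (3 GB) DRAM dies.
# The per-die density is an assumption; the article gives only the 36 GB total.

layers = 12
die_capacity_gb = 24 / 8          # 24 gigabits -> 3 gigabytes per die

stack_capacity_gb = layers * die_capacity_gb
print(f"{stack_capacity_gb:.0f} GB per stack")  # 36 GB
```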
SK Hynix was the first company to mass-produce HBM3, in 2022, and followed with both 8-layer and 12-layer HBM3E in 2024, sustaining its lead in AI memory by delivering the products customers need on their timelines. Justin Kim, who heads the company's AI infrastructure business, credited years of work on difficult technical challenges for that leadership, and said the company is fully prepared for the certification process and the ramp to mass production, drawing on its experience as the industry's largest HBM supplier.