SK hynix rolls out 12-layer HBM4 for faster AI
[QUOTE="Nehanda, post: 29095, member: 2262"] SK Hynix has sent samples of its new 12-layer HBM4 memory chips to major customers ahead of all competitors. These super-fast DRAM chips for AI reached customers early because SK Hynix knows what it's doing with both design and manufacturing. The company has already started testing these chips with its customers. They aim to be ready for full production in the second half of this year, which will make them even stronger in the AI memory business. These new 12-layer HBM4 chips have the most storage and fastest speeds of any AI memory you can find. The chips process more than two terabytes of data each second, something no other product can do. Think about watching 400 HD movies all at the same time - that's how much data these chips handle every second! They run more than 60 percent faster than the older HBM3E chips. The company uses what they call Advanced MR-MUF to pack 36 GB into these chips - the most capacity of any 12-layer HBM product. They've already proven this method works great when making earlier chips. This special process keeps the chips flat and helps them stay cool, which makes them work better and last longer. SK Hynix made history as the first company to mass-produce HBM3 back in 2022. Then, they made both 8-layer and 12-layer HBM3E in 2024. They keep leading the AI memory market by creating exactly what customers need right when they need it. Justin Kim, who runs the AI part of SK Hynix, says years of hard work solving tough problems has made them leaders in AI technology. He says they're all set to test these chips and prepare for mass production, using everything they've learned as the biggest HBM maker around. [/QUOTE]