HBM3E 12H DRAM Features
As a global leader in cutting-edge memory technology, Samsung Electronics has announced HBM3E 12H DRAM, the industry's first 12-stack HBM3E DRAM and the highest-capacity HBM product to date.
Samsung's HBM3E 12H DRAM offers an industry-leading 36 gigabytes (GB) of capacity and an unprecedented 1,280 gigabytes per second (GB/s) of bandwidth, both more than 50% higher than the 8-stack HBM3 8H.
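To see what those headline figures imply at the die and pin level, here is a quick back-of-the-envelope sketch. It assumes the standard 1024-bit HBM interface width; the per-pin data rate and per-die capacity it derives are inferences from the stated numbers, not figures quoted by Samsung.

```python
# Back-of-the-envelope check of the stated HBM3E 12H figures.
# Assumption: the stack exposes the standard 1024-bit HBM I/O interface.

INTERFACE_WIDTH_BITS = 1024   # standard HBM interface width (assumed)
BANDWIDTH_GB_PER_S = 1280     # stated: 1,280 GB/s per stack
CAPACITY_GB = 36              # stated: 36 GB per stack
DIES_PER_STACK = 12           # stated: 12-high die stack

# Per-pin data rate implied by the stated bandwidth:
# 1,280 GB/s * 8 bits/byte / 1,024 pins = 10 Gb/s per pin
per_pin_gbps = BANDWIDTH_GB_PER_S * 8 / INTERFACE_WIDTH_BITS

# Per-die capacity implied by the stated stack capacity:
# 36 GB / 12 dies = 3 GB per die (i.e. a 24 Gb DRAM die)
per_die_gb = CAPACITY_GB / DIES_PER_STACK

print(f"Implied per-pin data rate: {per_pin_gbps:.1f} Gb/s")  # 10.0 Gb/s
print(f"Implied per-die capacity:  {per_die_gb:.0f} GB (24 Gb)")
```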
With its innovative 12-layer stack, Samsung's HBM3E 12H DRAM delivers the industry's highest-capacity HBM, improving both performance and capacity by more than 50%. Advanced TC NCF technology improves its thermal characteristics and vertical density, positioning Samsung to meet the demand for high-performance, high-capacity memory solutions in the AI era.
The new HBM3E 12H DRAM meets AI service providers' growing need for higher-capacity HBM, said Yongcheol Bae, Executive Vice President of Memory Product Planning at Samsung Electronics. The new memory solution was developed as part of the effort to establish core technologies for high-stack HBM and to lead the high-capacity HBM market in the AI era.
Samsung's HBM3E 12H DRAM uses advanced thermal compression non-conductive film (TC NCF), which allows the 12-layer devices to meet current HBM package specifications by keeping the same height as 8-layer ones. The technology is expected to bring further advantages at higher stack counts, as the industry works to mitigate the chip die warping that comes with thinner dies. Samsung has continued to reduce the thickness of its NCF material and achieved the industry's smallest inter-chip gap of seven micrometers (µm), while also eliminating voids between layers. Compared with its HBM3 8H product, these efforts yield an approximately 20% increase in vertical density.
Samsung's advanced TC NCF also allows bumps of different sizes to be used between the chips, which improves the HBM's thermal characteristics. During chip bonding, smaller bumps are used in signaling areas while larger ones are placed in heat-dissipation zones. The approach also improves product yield.
As AI applications continue to grow exponentially, the HBM3E 12H DRAM is expected to be an optimal choice for future systems that require more memory. Its higher performance and capacity will let customers manage their resources more flexibly and lower the total cost of ownership (TCO) of data centers. When used in AI applications, it is expected to increase the average speed of AI training by 34% and to expand the number of simultaneous users of inference services by more than 11.5 times compared with HBM3 8H.
Samsung has begun providing HBM3E 12H DRAM samples to customers, and mass production is expected to begin in the first half of this year.
FAQs
What is HBM3E 12H DRAM (Samsung 36GB)?
Samsung created this high-capacity, high-performance memory chip. It features:
An industry-first 12-stack design, which enables higher capacity than earlier HBM models.
The highest-capacity HBM product available today, offering 36GB of storage, more than 50% more than the 8-stack model.
More than 50% additional bandwidth, with a peak transfer rate of 1,280 gigabytes per second (GB/s).
What advantages does this DRAM offer?
Because of its higher bandwidth and capacity, it’s perfect for applications that need:
Big data: data centers, high-performance computing (HPC), and artificial intelligence (AI).
Fast data processing: complex simulations and AI training.
How does it stay the same height as earlier iterations?
Samsung uses advanced thermal compression non-conductive film (TC NCF) technology, which permits a 12-layer stack while preserving compatibility with current HBM packaging standards.
What are the ramifications for the future?
Memory technology has advanced significantly with the introduction of the HBM3E 12H DRAM. It might be able to:
Accelerate AI development by enabling huge datasets to be processed and trained on more quickly.
Boost HPC performance by providing the memory bandwidth required for complex computations.
Encourage innovation in data-intensive fields such as the metaverse, driverless cars, and scientific research.
When can I get my hands on this DRAM?
Samsung has not yet announced a precise availability date, although sampling has already begun. Its arrival nevertheless marks a step forward in memory technology and opens the door to further high-performance computing applications.
Samsung 36GB HBM3E 12H DRAM Specifications
| Feature | Specification |
|---|---|
| Memory Type | High-Bandwidth Memory (HBM3E) |
| Capacity | 36 GB |
| Stack Height | 12 layers |
| Bandwidth | 1,280 GB/s |
| I/O Interface | 1024-bit |
| Voltage | 1.2 V |