25-03-2025 | Micron | Industrial
Micron Technology, Inc. has announced it is the world's first and only memory company shipping HBM3E and SOCAMM products for AI servers in the data centre. This extends its industry leadership in designing and delivering LPDDR for data centre applications.
The company's SOCAMM, a modular LPDDR5X memory solution, was developed with NVIDIA to support the NVIDIA GB300 Grace Blackwell Ultra Superchip. The Micron HBM3E 12H 36GB is also designed into the NVIDIA HGX B300 NVL16 and GB300 NVL72 platforms, while the HBM3E 8H 24GB is available for the NVIDIA HGX B200 and GB200 NVL72 platforms. The deployment of the HBM3E products in NVIDIA Hopper and NVIDIA Blackwell systems underscores the company's critical role in accelerating AI workloads.
The company's broad portfolio includes HBM3E 8H 24GB and HBM3E 12H 36GB, LPDDR5X SOCAMMs, GDDR7, and high-capacity DDR5 RDIMMs and MRDIMMs. It also offers an industry-leading portfolio of data centre SSDs, as well as automotive and industrial products such as UFS 4.1, NVMe SSDs and LPDDR5X, all well suited to edge compute applications.
"AI is driving a paradigm shift in computing, and memory is at the heart of this evolution. Micron's contributions to the NVIDIA Grace Blackwell platform yield significant performance and power-saving benefits for AI training and inference applications," said Raj Narasimhan, senior vice president and general manager of Micron's Compute and Networking Business Unit. "HBM and LP memory solutions help unlock improved computational capabilities for GPUs."
The modular SOCAMM solution enables accelerated data processing, superior performance, improved power efficiency and enhanced serviceability, delivering the high-capacity memory that growing AI workloads require.
The company's SOCAMM is the world's fastest, smallest, lowest-power and highest-capacity modular memory solution, designed to meet the demands of AI servers and data-intensive applications. The new SOCAMM allows data centres to achieve the same compute capacity with higher bandwidth, lower power consumption and better scalability, giving infrastructure greater flexibility.
The company continues its competitive lead in the AI industry with the HBM3E 12H 36GB, which delivers 50% more capacity than the HBM3E 8H 24GB within the same cube form factor. The HBM3E 12H 36GB also consumes up to 20% less power than competitors' HBM3E 8H 24GB offerings while delivering 50% higher memory capacity.
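The capacity figures above are internally consistent: moving from an 8-high to a 12-high stack of the same DRAM die raises the die count, and therefore the capacity, by exactly 50%. A minimal sketch checking this arithmetic, using only the capacities quoted in the article:

```python
# Capacities quoted in the article (GB per cube)
HBM3E_8H_GB = 24   # 8-high stack
HBM3E_12H_GB = 36  # 12-high stack

# Implied per-die capacity: both stacks use 3GB per die
per_die_8h = HBM3E_8H_GB / 8    # 3.0
per_die_12h = HBM3E_12H_GB / 12  # 3.0

# Capacity uplift of the 12-high stack over the 8-high stack
uplift = (HBM3E_12H_GB - HBM3E_8H_GB) / HBM3E_8H_GB
print(f"Capacity uplift: {uplift:.0%}")  # 50%, matching the stated figure
```

The same die capacity in a taller stack is what allows the 50% increase within an unchanged cube form factor.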
By continuing to deliver outstanding power and performance metrics, the company aims to maintain its technology momentum as a leading AI memory solutions provider through the launch of HBM4. Its HBM4 solution is expected to boost performance by over 50% compared to HBM3E.
The company also has a proven portfolio of storage products developed to meet the growing demands of AI workloads. Rapidly advancing storage performance and power efficiency requires close collaboration with ecosystem partners to ensure interoperability and a seamless customer experience. The company delivers SSDs optimised for AI workloads such as inference, training, data preparation, analytics and data lakes.