News

Ray Wang of Futurum says SK Hynix will be able to hold on to its lead in high bandwidth memory chip technology despite ...
Sandisk has appointed two leading figures in computing to help shape the direction of its high-capacity memory tech for AI ...
Samsung Electronics announced during a conference call on the 31st that "sales of high bandwidth memory (HBM) increased by ...
Enfabrica, a Silicon Valley-based chip startup working on solving bottlenecks in artificial intelligence data centers, on ...
High Bandwidth Memory (HBM) is the type of DRAM commonly used in data center GPUs such as NVIDIA's H200 and AMD's MI325X. High Bandwidth Flash (HBF) is a stack of flash chips with an HBM interface.
High-Bandwidth Memory Chips Market is Segmented by Type (HBM2, HBM2E, HBM3, HBM3E, Others), by Application (Servers, Networking Products, Consumer Products, Others): Global Opportunity Analysis and ...
Generative AI is arguably the most complex application that humankind has ever created, and the math behind it is incredibly ...
Samsung Electronics is reportedly pushing back the mass production of its next-gen high-bandwidth memory (HBM) chips to 2026, ...
Enfabrica, a Silicon Valley-based startup backed by Nvidia, has unveiled a breakthrough product that may significantly ...
Revenue rose about 35% in the June quarter compared with the same period a year earlier, while operating profit rose 68%, ...
High-bandwidth memory, or HBM, is poised for a breakthrough year in 2026 as AI's compute-hungry needs continue to reshape the memory landscape, UBS analysts said in a recent note.
It began shipping its next-generation HBM4 memory in early June 2025, delivering 36 GB, 12-high HBM4 samples to key customers, reportedly including Nvidia.