High Bandwidth Memory
High Bandwidth Memory (HBM) is a computer memory interface for 3D-stacked synchronous dynamic random-access memory (SDRAM), initially from Samsung, AMD and SK Hynix. It is used in conjunction with high-performance graphics accelerators, network devices, high-performance datacenter AI ASICs, as on-package cache in CPUs and on-package RAM in upcoming CPUs, and FPGAs, and in some supercomputers (such as the NEC SX-Aurora TSUBASA and Fujitsu A64FX). The first HBM memory chip was produced by SK Hynix in 2013, and the first devices to use HBM were the AMD Fiji GPUs in 2015. HBM was adopted by JEDEC as an industry standard in October 2013 (High Bandwidth Memory (HBM) DRAM, JESD235, JEDEC, October 2013). The second generation, HBM2, was accepted by JEDEC in January 2016. JEDEC officially announced the HBM3 standard on January 27, 2022.


Technology

HBM achieves higher bandwidth than DDR4 or GDDR5 while using less power, and in a substantially smaller form factor ("HBM: Memory Solution for Bandwidth-Hungry Processors", Joonyoung Kim and Younsu Kim, SK Hynix, Hot Chips 26, August 2014). This is achieved by stacking up to eight DRAM dies and an optional base die which can include buffer circuitry and test logic. The stack is often connected to the memory controller on a GPU or CPU through a substrate, such as a silicon interposer. Within the stack the dies are vertically interconnected by through-silicon vias (TSVs) and microbumps. HBM is similar in principle to, but incompatible with, the Hybrid Memory Cube (HMC) interface developed by Micron Technology.

The HBM memory bus is very wide in comparison to other DRAM memories such as DDR4 or GDDR5. An HBM stack of four DRAM dies (4-Hi) has two 128-bit channels per die, for a total of 8 channels and a width of 1024 bits in total. A graphics card or GPU with four 4-Hi HBM stacks would therefore have a memory bus with a width of 4096 bits. In comparison, the bus width of GDDR memories is 32 bits, with 16 channels for a graphics card with a 512-bit memory interface. HBM supports up to 4 GB per package.

The larger number of connections to the memory, relative to DDR4 or GDDR5, required a new method of connecting the HBM memory to the GPU (or other processor). AMD and Nvidia have both used purpose-built silicon chips, called interposers, to connect the memory and the GPU. Because semiconductor device fabrication is significantly more expensive than printed circuit board manufacture, this adds cost to the final product.

[Images: HBM DRAM die; HBM controller die; HBM memory on an AMD Radeon R9 Nano graphics card's GPU package]
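The bus-width arithmetic above can be made concrete with a small sketch. This is purely illustrative Python using only the figures quoted in the text; the constants and helper names are mine, not part of any HBM specification.

```python
# Illustrative arithmetic only: how HBM bus width adds up.

CHANNEL_WIDTH_BITS = 128      # each first-generation HBM channel is 128 bits wide
CHANNELS_PER_DIE = 2          # two 128-bit channels per DRAM die

def stack_bus_width(dies_per_stack: int) -> int:
    """Total bus width of one HBM stack, in bits."""
    return dies_per_stack * CHANNELS_PER_DIE * CHANNEL_WIDTH_BITS

def package_bus_width(stacks: int, dies_per_stack: int = 4) -> int:
    """Total memory bus width seen by a GPU with several HBM stacks."""
    return stacks * stack_bus_width(dies_per_stack)

print(stack_bus_width(4))     # 1024 bits for a 4-Hi stack (8 channels x 128 bits)
print(package_bus_width(4))   # 4096 bits for a GPU with four 4-Hi stacks
print(16 * 32)                # 512-bit GDDR5 interface = 16 channels x 32 bits
```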


Interface

The HBM DRAM is tightly coupled to the host compute die with a distributed interface. The interface is divided into independent channels. The channels are completely independent of one another and are not necessarily synchronous to each other. The HBM DRAM uses a wide-interface architecture to achieve high-speed, low-power operation. The HBM DRAM uses a 500 MHz differential clock CK_t / CK_c (where the suffix "_t" denotes the "true", or "positive", component of the differential pair, and "_c" stands for the complementary one). Commands are registered at the rising edge of CK_t, CK_c. Each channel interface maintains a 128-bit data bus operating at double data rate (DDR). HBM supports transfer rates of 1 GT/s per pin (transferring 1 bit), yielding an overall package bandwidth of 128 GB/s.
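A minimal sketch of that bandwidth arithmetic, assuming only the figures given above (500 MHz clock, double data rate, 1024 data pins per package); the function name is a hypothetical helper, not part of the JEDEC specification.

```python
# Peak bandwidth in GB/s: transfers per second per pin x number of pins / 8 bits per byte.
def hbm_bandwidth_gbps(transfer_rate_gtps: float, bus_width_bits: int = 1024) -> float:
    return transfer_rate_gtps * bus_width_bits / 8

# First-generation HBM: 500 MHz clock, double data rate -> 1 GT/s per pin.
clock_mhz = 500
transfer_rate = 2 * clock_mhz / 1000      # two transfers per clock = 1.0 GT/s

print(hbm_bandwidth_gbps(transfer_rate))  # 128.0 GB/s per 1024-bit package
```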


HBM2

The second generation of High Bandwidth Memory, HBM2, also specifies up to eight dies per stack and doubles pin transfer rates up to 2 GT/s. Retaining 1024-bit wide access, HBM2 is able to reach 256 GB/s memory bandwidth per package. The HBM2 specification allows up to 8 GB per package. HBM2 is predicted to be especially useful for performance-sensitive consumer applications such as virtual reality. On January 19, 2016, Samsung announced early mass production of HBM2, at up to 8 GB per stack. SK Hynix also announced availability of 4 GB stacks in August 2016.

[Images: HBM2 DRAM die; HBM2 controller die; the HBM2 interposer of a Radeon RX Vega 64 GPU, with the HBM dies removed and the GPU die still in place]
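The same back-of-the-envelope arithmetic applied to HBM2's doubled pin rate; illustrative only, using the figures from the text.

```python
# HBM2: 2 GT/s per pin on a 1024-bit stack interface.
bus_width_bits = 1024
pin_rate_gtps = 2.0
print(pin_rate_gtps * bus_width_bits / 8)   # 256.0 GB/s per HBM2 stack
```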


HBM2E

In late 2018, JEDEC announced an update to the HBM2 specification, providing for increased bandwidth and capacities. Up to 307 GB/s per stack (2.5 Tbit/s effective data rate) is now supported in the official specification, though products operating at this speed had already been available. Additionally, the update added support for 12-Hi stacks (12 dies), making capacities of up to 24 GB per stack possible.

On March 20, 2019, Samsung announced their Flashbolt HBM2E, featuring eight dies per stack and a transfer rate of 3.2 GT/s, providing a total of 16 GB and 410 GB/s per stack. On August 12, 2019, SK Hynix announced their HBM2E, featuring eight dies per stack and a transfer rate of 3.6 GT/s, providing a total of 16 GB and 460 GB/s per stack. On July 2, 2020, SK Hynix announced that mass production had begun. In October 2019, Samsung announced their 12-layered HBM2E.
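As a quick check, the announced per-stack figures follow from the 1024-bit stack interface and the quoted transfer rates (the vendors round the results); the snippet below is illustrative only.

```python
# HBM2E per-stack bandwidth at the announced pin rates, on a 1024-bit bus.
for vendor, gtps in [("Samsung Flashbolt", 3.2), ("SK Hynix", 3.6)]:
    gbps = gtps * 1024 / 8
    print(f"{vendor}: {gbps:.1f} GB/s per stack")   # 409.6 and 460.8 -> ~410 and ~460
```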


HBM3

In late 2020, Micron unveiled that the HBM2E standard would be updated, and alongside that it unveiled the next standard, known as HBMnext (later renamed HBM3). This was to be a big generational leap from HBM2 and the replacement for HBM2E. This new VRAM would have come to the market in Q4 2022. It would likely introduce a new architecture, as the naming suggests. While the architecture might be overhauled, leaks pointed to performance similar to that of the updated HBM2E standard. This RAM was likely to be used mostly in data center GPUs.

In mid-2021, SK Hynix unveiled some specifications of the HBM3 standard, with 5.2 Gbit/s I/O speeds and bandwidth of 665 GB/s per package, as well as up to 16-high 2.5D and 3D solutions. On 20 October 2021, before the JEDEC standard for HBM3 was finalised, SK Hynix was the first memory vendor to announce that it had finished development of HBM3 memory devices. According to SK Hynix, the memory would run as fast as 6.4 Gbit/s per pin, double the data rate of JEDEC-standard HBM2E, which formally tops out at 3.2 Gbit/s per pin, or 78% faster than SK Hynix's own 3.6 Gbit/s-per-pin HBM2E. The devices support a data transfer rate of 6.4 Gbit/s, and therefore a single HBM3 stack may provide a bandwidth of up to 819 GB/s. The basic bus widths for HBM3 remain unchanged, with a single stack of memory being 1024 bits wide. SK Hynix would offer their memory in two capacities: 16 GB and 24 GB, corresponding to 8-Hi and 12-Hi stacks respectively. The stacks consist of 8 or 12 16 Gb DRAM dies, each 30 μm thick, interconnected using through-silicon vias (TSVs).

According to Ryan Smith of AnandTech, SK Hynix's first-generation HBM3 memory has the same density as their latest-generation HBM2E memory, meaning that device vendors looking to increase the total memory capacities of their next-generation parts would need to use memory with 12 dies/layers, up from the 8-layer stacks they typically used until then. According to Anton Shilov of Tom's Hardware, high-performance compute GPUs and FPGAs typically use four or six HBM stacks, so with SK Hynix's 24 GB HBM3 stacks they would accordingly get 3.2 TB/s or 4.9 TB/s of memory bandwidth. He also noted that SK Hynix's HBM3 chips are square, not rectangular like HBM2 and HBM2E chips. According to Chris Mellor of The Register, JEDEC not yet having developed its HBM3 standard might mean that SK Hynix would need to retrofit its design to a future and faster one.

JEDEC officially announced the HBM3 standard on January 27, 2022. The number of memory channels was doubled from 8 channels of 128 bits with HBM2E to 16 channels of 64 bits with HBM3; the total number of data pins of the interface is therefore still 1024. In June 2022, SK Hynix announced that it had started mass production of the industry's first HBM3 memory, to be used with Nvidia's H100 GPU, expected to ship in Q3 2022. The memory would provide the H100 with "up to 819 GB/s" of memory bandwidth. In August 2022, Nvidia announced that its "Hopper" H100 GPU would ship with five active HBM3 sites (out of six on board), offering 80 GB of RAM and 3 TB/s of memory bandwidth (16 GB and 600 GB/s per site).
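The HBM3 figures quoted above can be recomputed as a quick sanity check; the sketch below uses only numbers from the text, and the variable names are my own.

```python
# HBM3 stack interface: 16 channels of 64 bits = 1024 data pins.
channels, channel_width = 16, 64
bus_width = channels * channel_width
pin_rate = 6.4                                 # Gbit/s per pin
print(bus_width, pin_rate * bus_width / 8)     # 1024 bits, 819.2 GB/s per stack

# Stack capacity: 8-Hi or 12-Hi stacks of 16 Gbit dies.
for layers in (8, 12):
    print(layers, layers * 16 / 8, "GB")       # 16.0 GB and 24.0 GB

# Nvidia H100 as described: five active HBM3 sites at 16 GB / 600 GB/s each.
print(5 * 16, "GB,", 5 * 600 / 1000, "TB/s")   # 80 GB, 3.0 TB/s
```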


HBM3E

On 30 May 2023, SK Hynix unveiled its HBM3E memory with an 8 Gbit/s/pin data processing speed (25% faster than HBM3), which was to enter production in the first half of 2024. At 8 GT/s with a 1024-bit bus, its bandwidth per stack is increased from 819.2 GB/s with HBM3 to 1 TB/s.

On 26 July 2023, Micron announced its HBM3E memory with a 9.6 Gbit/s/pin data processing speed (50% faster than HBM3). Micron HBM3E memory is a high-performance HBM that uses 1β DRAM process technology and advanced packaging to achieve the highest performance, capacity and power efficiency in the industry. It can store 24 GB per 8-high cube and allows data transfer at 1.2 TB/s. A 12-high cube with 36 GB capacity was planned for 2024.

In August 2023, Nvidia announced a new version of their GH200 Grace Hopper superchip that utilizes 141 GB (144 GiB physical) of HBM3e over a 6144-bit bus, providing 50% higher memory bandwidth and 75% higher memory capacity than the HBM3 version.

In May 2023, Samsung announced HBM3P with speeds up to 7.2 Gbit/s, to be in production in 2024. On October 20, 2023, Samsung announced their HBM3E "Shinebolt" with up to 9.8 Gbit/s memory. On February 26, 2024, Micron announced mass production of its HBM3E memory. On March 18, 2024, Nvidia announced the Blackwell series of GPUs using HBM3E memory. On March 19, 2024, SK Hynix announced mass production of its HBM3E memory. In September 2024, SK Hynix announced mass production of its 12-layered HBM3E memory, followed in November by the 16-layered version.
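Again, the per-stack and aggregate figures follow from the 1024-bit stack interface; the short sketch below recomputes them for illustration (identifiers are mine).

```python
# HBM3E per-stack bandwidth at the announced pin rates, on a 1024-bit stack bus.
bus = 1024
print(8.0 * bus / 8)     # 1024.0 GB/s ~ 1 TB/s per stack (SK Hynix, 8 Gbit/s/pin)
print(9.6 * bus / 8)     # 1228.8 GB/s ~ 1.2 TB/s per stack (Micron, 9.6 Gbit/s/pin)

# GH200 Grace Hopper: a 6144-bit HBM3e interface corresponds to six 1024-bit stacks.
print(6144 // bus, "stacks")
```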


HBM-PIM

In February 2021, Samsung announced the development of HBM with processing-in-memory (PIM). This new memory brings AI computing capabilities inside the memory, to increase the large-scale processing of data. A DRAM-optimised AI engine is placed inside each memory bank to enable parallel processing and minimise data movement. Samsung claims this will deliver twice the system performance and reduce energy consumption by more than 70%, while not requiring any hardware or software changes to the rest of the system.
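As a purely conceptual illustration (not Samsung's design or API), the snippet below contrasts shipping all data to the host with a PIM-style approach in which each bank reduces its own data, so that only small partial results cross the memory bus.

```python
# Conceptual sketch of processing-in-memory: toy data resident in three "banks".
banks = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

# Conventional path: every element crosses the memory bus, the host does the work.
host_sum = sum(x for bank in banks for x in bank)

# PIM-style path: each bank reduces its own data; only per-bank results move.
partials = [sum(bank) for bank in banks]     # computed "inside" each bank
pim_sum = sum(partials)

print(host_sum, pim_sum)                     # same answer, far less data movement
```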


HBM4

In July 2024, JEDEC announced its preliminary specifications for the future HBM4. It lowers the data rate per pin back to 6.4 Gbit/s (the level of HBM3), but because it employs a 2048-bit interface per stack (double that of the previous generations), it still achieves a greater per-stack data rate (1.6 TB/s) than HBM3E. Additionally, it will allow 4 GB layers (yielding 64 GB in 16-layer configurations).
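A quick recomputation of the preliminary HBM4 figures quoted above, for illustration only:

```python
# HBM4 (preliminary): 6.4 Gbit/s per pin on a 2048-bit stack interface.
pin_rate, bus = 6.4, 2048
print(pin_rate * bus / 8 / 1000)   # ~1.64 TB/s per stack
print(16 * 4, "GB")                # 64 GB in a 16-layer stack of 4 GB layers
```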


History


Background

Die-stacked memory was initially commercialized in the flash memory industry. Toshiba introduced a NAND flash memory chip with eight stacked dies in April 2007, followed by Hynix Semiconductor introducing a NAND flash chip with 24 stacked dies in September 2007.

3D-stacked random-access memory (RAM) using through-silicon via (TSV) technology was commercialized by Elpida Memory, which developed the first 8 GB DRAM chip (stacked with four DDR3 SDRAM dies) in September 2009 and released it in June 2011. In 2011, SK Hynix introduced 16 GB DDR3 memory (40 nm class) using TSV technology, Samsung Electronics introduced 3D-stacked 32 GB DDR3 (30 nm class) based on TSV in September, and then Samsung and Micron Technology announced TSV-based Hybrid Memory Cube (HMC) technology in October.

JEDEC first released the JESD229 standard for Wide IO memory, the predecessor of HBM featuring four 128-bit channels with single data rate clocking, in December 2011 after several years of work. The first HBM standard, JESD235, followed in October 2013.


Development

The development of High Bandwidth Memory began at AMD in 2008 to solve the problem of ever-increasing power usage and form factor of computer memory. Over the next several years, AMD developed procedures to solve die-stacking problems with a team led by Senior AMD Fellow Bryan Black ("High-Bandwidth Memory (HBM) from AMD: Making Beautiful Memory", AMD). To help AMD realize their vision of HBM, they enlisted partners from the memory industry, particularly Korean company SK Hynix, which had prior experience with 3D-stacked memory, as well as partners from the interposer industry (UMC) and the packaging industry (Amkor Technology and ASE).

The development of HBM was completed in 2013, when SK Hynix built the first HBM memory chip. HBM was adopted as industry standard JESD235 by JEDEC in October 2013, following a proposal by AMD and SK Hynix in 2010. High volume manufacturing began at a Hynix facility in Icheon, South Korea, in 2015.

The first GPU utilizing HBM was the AMD Fiji, released in June 2015 powering the AMD Radeon R9 Fury X.

In January 2016, Samsung Electronics began early mass production of HBM2. The same month, HBM2 was accepted by JEDEC as standard JESD235a. The first GPU chip utilizing HBM2 was the Nvidia Tesla P100, which was officially announced in April 2016. In June 2016, Intel released a family of Xeon Phi processors with 8 stacks of HCDRAM, Micron's version of HBM. At Hot Chips in August 2016, both Samsung and Hynix announced new-generation HBM memory technologies. Both companies announced high performance products expected to have increased density, increased bandwidth, and lower power consumption. Samsung also announced a lower-cost version of HBM under development targeting mass markets; removing the buffer die and decreasing the number of TSVs lowers cost, though at the expense of decreased overall bandwidth (200 GB/s).

Nvidia announced the Nvidia Hopper H100 GPU, the world's first GPU utilizing HBM3, on March 22, 2022.


See also

* Stacked DRAM
* eDRAM
* Chip stack multi-chip module
* Hybrid Memory Cube (HMC): stacked memory standard from Micron Technology (2011)


References


External links


High Bandwidth Memory (HBM) DRAM (JESD235)
JEDEC, October 2013
HBM vs HBM2 vs GDDR5 vs GDDR5X Memory Comparison