| Type of RAM | |
|---|---|
| Developer | JEDEC |
| Type | Synchronous dynamic random-access memory |
| Generation | 6th generation |
| Predecessor | GDDR5 SDRAM |
| Successor | GDDR7 SDRAM |
Graphics Double Data Rate 6 Synchronous Dynamic Random-Access Memory (GDDR6 SDRAM) is a type of synchronous graphics random-access memory (SGRAM) with a high-bandwidth, "double data rate" interface, designed for use in graphics cards, game consoles, and high-performance computing. It is a type of GDDR SDRAM (graphics DDR SDRAM), and is the successor to GDDR5. Just like GDDR5X, it uses QDR (quad data rate) in reference to the write command clock (WCK) and ODR (octal data rate) in reference to the command clock (CK).[1]
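The clock relationships above can be sketched numerically. This is an illustrative calculation, assuming the QDR/ODR ratios stated in the text (per-pin data rate = 4 × WCK = 8 × CK); the example rate of 16 Gbit/s is the specified GDDR6 maximum, but the derived clock frequencies are arithmetic, not datasheet values.

```python
def gddr6_clocks(data_rate_gbps: float) -> tuple[float, float]:
    """Return (WCK, CK) frequencies in GHz for a given per-pin data rate in Gbit/s."""
    wck = data_rate_gbps / 4  # QDR: four data bits transferred per WCK cycle
    ck = data_rate_gbps / 8   # ODR: eight data bits transferred per CK cycle
    return wck, ck

wck, ck = gddr6_clocks(16.0)  # 16 Gbit/s per pin
print(wck, ck)  # WCK = 4.0 GHz, CK = 2.0 GHz
```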
The finalized specification was published by JEDEC in July 2017.[2] GDDR6 offers increased per-pin bandwidth (up to 16 Gbit/s[3]) and lower operating voltages (1.35 V[4]), increasing performance and decreasing power consumption relative to GDDR5X.[5][6]
At Hot Chips 2016, Samsung announced GDDR6 as the successor of GDDR5X.[5][6] Samsung later announced that the first products would be 16 Gbit/s, 1.35 V chips.[7][8] In January 2018, Samsung began mass production of 16 Gb (2 GB) GDDR6 chips, fabricated on a 10 nm class process and with a data rate of up to 18 Gbit/s per pin.[9][8][10]
In February 2017, Micron Technology announced it would release its own GDDR6 products by early 2018.[11] Micron began mass production of 8 Gb chips in June 2018.[12]
SK Hynix announced its GDDR6 products would be released in early 2018.[13][3] In April 2017, SK Hynix announced that its GDDR6 chips would be produced on a 21 nm process and would operate at a 10% lower voltage than GDDR5.[3] The SK Hynix chips were expected to have a transfer rate of 14–16 Gbit/s.[4] The first graphics cards to use SK Hynix's GDDR6 RAM were expected to carry 12 GB of memory on a 384-bit memory bus, yielding a bandwidth of 768 GB/s.[3] SK Hynix began mass production in February 2018, with 8 Gbit chips and a data rate of 14 Gbit/s per pin.[14]
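The bandwidth figure quoted above follows directly from the per-pin rate and bus width. A minimal sketch of that arithmetic (aggregate bandwidth = per-pin data rate × bus width ÷ 8 bits per byte):

```python
def bus_bandwidth_gbs(gbit_per_pin: float, bus_width_bits: int) -> float:
    """Aggregate memory bandwidth in GB/s from per-pin rate (Gbit/s) and bus width (bits)."""
    return gbit_per_pin * bus_width_bits / 8  # divide by 8 to convert bits to bytes

print(bus_bandwidth_gbs(16, 384))  # 768.0 GB/s, matching the figure cited above
```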
Nvidia officially announced the first consumer graphics cards using GDDR6, the Turing-based GeForce RTX 2080 Ti, RTX 2080, and RTX 2070, on August 20, 2018,[15] the RTX 2060 on January 6, 2019,[16] and the GTX 1660 Ti on February 22, 2019.[17] GDDR6 memory from Samsung Electronics is also used for the Turing-based Quadro RTX series.[18] The RTX 20 series initially launched with Micron memory chips, before switching to Samsung chips by November 2018.[19]
AMD officially announced the Radeon RX 5700, 5700 XT, and 5700 XT 50th Anniversary Edition on June 10, 2019. These Navi 10[20] GPUs utilize 8 GB of GDDR6 memory.[21]
Micron developed GDDR6X in close collaboration with Nvidia, and Nvidia was Micron's only GDDR6X launch partner.[22] At launch, GDDR6X SGRAM had not yet been standardized by JEDEC. GDDR6X offers increased per-pin bandwidth of 19–21 Gbit/s using PAM4 signaling, which transmits two bits per symbol, replacing the earlier NRZ (non-return-to-zero, PAM2) coding that carried only one bit per symbol and thereby limited the per-pin bandwidth of GDDR6 to 16 Gbit/s.[23] The first graphics cards to use GDDR6X are the Nvidia GeForce RTX 3080 and 3090. PAM4 signaling is not new, but it costs more to implement, partly because it requires more die area and is more prone to signal-to-noise ratio (SNR) issues,[24] which had mostly limited its use to high-speed networking (such as 200G Ethernet). GDDR6X consumes 15% less power per transferred bit than GDDR6, but overall power consumption is higher since GDDR6X is faster than GDDR6. On average, PAM4 consumes less power and uses fewer pins than differential signaling while still being faster than NRZ. GDDR6X is thought to be cheaper than High Bandwidth Memory.[25]
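The PAM4-versus-NRZ advantage described above is purely a bits-per-symbol effect, which a short illustrative calculation makes concrete: an n-level signal carries log2(n) bits per symbol, so at the same symbol (baud) rate, PAM4 (4 levels) moves twice the data of NRZ (2 levels). The 10.5 Gbaud figure below is chosen only to line up with the 21 Gbit/s top speed cited above.

```python
import math

def bit_rate_gbps(baud_rate_gbaud: float, levels: int) -> float:
    """Bit rate in Gbit/s given a symbol rate in Gbaud and the number of signal levels."""
    bits_per_symbol = math.log2(levels)  # NRZ (2 levels) -> 1 bit; PAM4 (4 levels) -> 2 bits
    return baud_rate_gbaud * bits_per_symbol

print(bit_rate_gbps(10.5, 2))  # NRZ at 10.5 Gbaud  -> 10.5 Gbit/s
print(bit_rate_gbps(10.5, 4))  # PAM4 at 10.5 Gbaud -> 21.0 Gbit/s
```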
Samsung announced the development of GDDR6W on November 29, 2022.[26]
Its main improvements over GDDR6 are doubled per-package capacity and bandwidth, achieved by stacking dies using Fan-Out Wafer-Level Packaging (FOWLP).