Samsung unveiled new chips that combine memory and processing at the Hot Chips 2023 conference on Tuesday.
The chips, which apply processing-in-memory (PIM) to high-bandwidth memory (HBM) and to LPDDR mobile DRAM, are aimed at future AI applications, the tech giant said.
Applying these chips to generative AI workloads will yield double the accelerator performance and power efficiency of conventional HBM, Samsung said.
To verify the design on a mixture-of-experts (MoE) model, Samsung built an HBM-PIM cluster from 96 of AMD's MI-100 GPUs fitted with HBM-PIM. On that cluster, the MoE model ran twice as fast as on standard HBM with three times the power efficiency, according to Samsung.
In an HBM-PIM, some of the processor's computation is moved into the memory itself, which reduces the amount of data transferred between the processor and memory. Because AI workloads shuttle massive amounts of data and routinely hit memory bottlenecks, the design is meant to mitigate exactly that problem.
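The data-movement savings can be illustrated with a toy accounting model. This is a hypothetical sketch of the general PIM idea, not Samsung's actual architecture or figures: it compares the bus traffic of a conventional read-compute-write pattern against one where a simple reduction happens inside the memory and only the result crosses the bus.

```python
# Toy model of data movement, conventional vs. processing-in-memory (PIM).
# All workload sizes and element widths are illustrative assumptions.

def bytes_moved_conventional(n, elem_size=2):
    # The processor reads two input vectors over the memory bus and
    # writes one result vector back: 3n elements cross the bus.
    return 3 * n * elem_size

def bytes_moved_pim(n, elem_size=2):
    # In-memory units combine the operands next to the DRAM banks;
    # only the n result elements cross the bus.
    return n * elem_size

n = 1_000_000  # hypothetical: one million FP16 activations
ratio = bytes_moved_conventional(n) // bytes_moved_pim(n)
print(ratio)  # 3x less bus traffic in this simplified model
```

Even this crude model shows why the technique targets memory-bound AI workloads: the win comes from moving less data, not from faster arithmetic.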
Samsung also showcased LPDDR-PIM, which applies PIM to mobile DRAM so that calculations can also be performed on the device itself.
The tech giant said the chip offers a bandwidth of 102.4GB/s while consuming 72% less power than conventional DRAM.