Samsung said on Wednesday that it has developed HBM-PIM, a high bandwidth memory (HBM) chip that integrates AI processing power.
The company installed AI engines inside the DRAM's memory banks to maximize parallel processing. This effectively shortens the distance data must travel between the CPU and memory, which increases performance and reduces power consumption.
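The data-movement saving behind processing-in-memory can be sketched in a toy Python model. The bank layout, 4-byte word size, and the summation workload here are illustrative assumptions for the sketch, not Samsung's actual HBM-PIM design:

```python
# Toy model of why processing-in-memory (PIM) cuts data movement.
# Conceptual sketch only -- bank sizes, the reduction operation, and
# the cost accounting are assumptions, not Samsung's architecture.

def conventional_sum(banks):
    """CPU-side compute: every element crosses the memory bus."""
    bytes_moved = sum(len(bank) for bank in banks) * 4  # assume 4-byte words
    total = sum(sum(bank) for bank in banks)
    return total, bytes_moved

def pim_sum(banks):
    """In-memory compute: each bank reduces its data locally, so only
    one partial result per bank crosses the bus."""
    partials = [sum(bank) for bank in banks]  # done "inside" each bank
    bytes_moved = len(banks) * 4              # one 4-byte partial per bank
    return sum(partials), bytes_moved

banks = [list(range(1024)) for _ in range(16)]  # 16 banks of 1024 words
total_a, moved_a = conventional_sum(banks)
total_b, moved_b = pim_sum(banks)
assert total_a == total_b  # same answer either way
print(moved_a, moved_b)    # far fewer bytes cross the bus with PIM
```

In this sketch the conventional path moves 64 KB across the bus while the PIM path moves 64 bytes; the answer is identical, only the traffic differs.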
Samsung said HBM-PIM delivers twice the performance of conventional HBM2 while cutting energy consumption by more than 70%.
Because the chip uses the same HBM interface, customers can adopt it without installing additional hardware or software.
Samsung said its customers' AI accelerators are testing HBM-PIM, with testing expected to wrap up within the first half of 2021.
The company will present a paper on HBM-PIM at the International Solid-State Circuits Conference, which runs through February 22.
Samsung is currently collaborating with the US Argonne National Laboratory's Computing, Environment and Life Sciences directorate.