Samsung tests memory with integrated processing for AI-centric servers



Samsung has moved ahead with plans to relieve devices of the tedious chore of moving data from memory to a processor – by putting more processing power into the memory itself. The technology is already running in servers and should become something of a standard next year.

The Korean giant’s efforts use its very fast Aquabolt high-bandwidth memory (HBM) architecture – a technology to which the company added processing-in-memory (PIM) capabilities in February 2021. Samsung hasn’t revealed many details about its PIM implementation, but The Register understands it involves placing a processing unit of unspecified capability alongside each bank of cells within the memory.
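For a sense of why that arrangement matters, here is a minimal, purely illustrative sketch – the class names and the per-bank summation are invented for this example and say nothing about Samsung’s actual design – showing how a compute unit sitting next to each bank of cells can shrink the amount of data that has to cross the memory bus:

```python
# Illustrative sketch only: hypothetical classes showing the general
# processing-in-memory idea, not Samsung's Aquabolt/HBM-PIM interface.
import numpy as np

class ConventionalMemory:
    """Every byte must cross the bus to the host before any arithmetic happens."""
    def __init__(self, banks):
        self.banks = banks                          # one np.ndarray per bank of cells

    def host_sum(self):
        moved = sum(b.nbytes for b in self.banks)   # the whole dataset travels
        return sum(b.sum() for b in self.banks), moved

class PIMMemory:
    """A small compute unit beside each bank reduces its data in place."""
    def __init__(self, banks):
        self.banks = banks

    def host_sum(self):
        partials = [b.sum() for b in self.banks]    # computed "inside" the DRAM
        moved = len(partials) * 8                   # only 8 bytes per bank cross the bus
        return sum(partials), moved

banks = [np.random.rand(1_000_000) for _ in range(4)]
for mem in (ConventionalMemory(banks), PIMMemory(banks)):
    total, moved = mem.host_sum()
    print(f"{type(mem).__name__}: result={total:,.1f}, bytes over the bus={moved:,}")
```

The arithmetic is identical in both cases; the difference is where it happens and how much traffic the host sees – which is also where the claimed power savings come from, since moving data off-chip typically costs more energy than computing on it.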

In early 2021, Samsung announced that it had HBM and PIM working together in the same piece of silicon. Yesterday it announced that it has made the pair work in a Xilinx Virtex UltraScale+ (Alveo) AI accelerator, and has also advanced HBM-PIM to the point where it is ready to be deployed in DIMMs and mobile memory.

Using HBM-PIM in the Xilinx device delivered 2.5 times the system performance while reducing power consumption by 60 per cent.

Samsung is now talking about “AXDIMM” – accelerated DIMMs – and says the units are currently being tested on customer servers.

The company claims that an AXDIMM “can perform parallel processing of multiple memory ranks (sets of DRAM chips) instead of accessing just one rank at a time.” Test results suggest “approximately twice the performance in AI-based recommendation applications and a 40% decrease in system-wide power consumption.”
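Recommendation models spend much of their time gathering and pooling rows from huge embedding tables, which is the kind of access pattern that claim is about. The toy sketch below – the rank striping and the near-memory gather are assumptions made for illustration, not a description of AXDIMM internals – shows the shape of the idea: stripe a table across ranks, let each rank gather and pre-pool its own rows concurrently, and return only small partial results to the host.

```python
# Toy sketch of parallel-rank embedding lookups; layout and near-memory
# gather are hypothetical, not Samsung's AXDIMM implementation.
import numpy as np

NUM_RANKS = 4
ROWS_PER_RANK = 250_000
EMBED_DIM = 64

# Embedding table striped across ranks: row r lives in rank r % NUM_RANKS.
ranks = [np.random.rand(ROWS_PER_RANK, EMBED_DIM).astype(np.float32)
         for _ in range(NUM_RANKS)]

def lookup_serial(row_ids):
    """Baseline: visit one rank at a time, as a conventional access would."""
    out = []
    for r in row_ids:
        rank, local = r % NUM_RANKS, r // NUM_RANKS
        out.append(ranks[rank][local])
    return np.stack(out)

def lookup_parallel(row_ids):
    """AXDIMM-style idea: each rank gathers and pre-pools its own rows,
    conceptually in parallel, so only per-rank partial sums return to the host."""
    partials = []
    for rank_id, table in enumerate(ranks):            # conceptually concurrent
        local = [r // NUM_RANKS for r in row_ids if r % NUM_RANKS == rank_id]
        if local:
            partials.append(table[local].sum(axis=0))
    return np.sum(partials, axis=0)                     # pooled embedding vector

row_ids = np.random.randint(0, NUM_RANKS * ROWS_PER_RANK, size=64)
pooled = lookup_parallel(row_ids)
assert np.allclose(pooled, lookup_serial(row_ids).sum(axis=0), atol=1e-3)
```

The point of the sketch is the traffic pattern: only one pooled vector per rank returns to the host, instead of every embedding row travelling one rank access at a time.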

SAP has given Samsung’s technology a boost in a canned statement, saying it likes the idea of speeding up its in-memory databases.

Samsung also hinted at broader collaborations, saying it “plans to expand its AI memory portfolio by working with other industry leaders to complete standardization of the PIM platform in the first half of 2022.”

“The company will also continue to foster a very robust PIM ecosystem by ensuring wide applicability in the memory market,” according to its canned statement.

Which sounds pretty tasty, because who doesn’t want faster, more flexible memory that changes server behavior?

In fact, that’s a harder question to answer than it sounds, given that software won’t know about AXDIMMs from day one, or even year one. The struggle to interest developers is one reason Intel’s Optane storage-class memory hasn’t set the world on fire – and Optane also promised to make servers faster and more flexible.

It’s lazy journalism to end a story by saying time will tell whether a new product succeeds. But in the absence of much detail on Samsung’s technology and the partnerships on offer, it’s hard to say more at this point. ®
