HPE steps closer to memory-driven computing

Hewlett Packard Enterprise has unveiled what it touts as the world's largest single-memory computer, the latest milestone in its research program, The Machine.

The Machine, which is the largest R&D program in the history of the company, is aimed at delivering a new paradigm called memory-driven computing – an architecture custom-built for the big data era.

The prototype contains 160 terabytes (TB) of memory, capable of simultaneously working with the data held in every book in the Library of Congress five times over – or approximately 160 million books.

Based on the current prototype, HPE expects the architecture could easily scale to an exabyte-scale single-memory system and, beyond that, to a "nearly limitless" pool of memory: 4,096 yottabytes, roughly 250,000 times the entire digital universe today.
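As a rough check on that comparison (assuming IDC's widely cited estimate of about 16 zettabytes of data created worldwide in 2016 as the size of the "digital universe"):

    4,096 YB = 4,096,000 ZB
    4,096,000 ZB / 16 ZB ≈ 256,000

which is the order of magnitude behind the 250,000-times figure.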

With that amount of memory, it would be possible to work simultaneously with every digital health record of every person on earth, every piece of data from Facebook, every trip of Google's autonomous vehicles and every data set from space exploration.

“We believe memory-driven computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society,” said Mark Potter, CTO at HPE.

“The architecture we have unveiled can be applied to every computing category – from intelligent edge devices to supercomputers,” said Potter.

Memory-driven computing puts memory, not the processor, at the center of the computing architecture.

By eliminating the inefficiencies of how memory, storage and processors interact in traditional systems, memory-driven computing reduces the time needed to process complex problems from days to hours, hours to minutes and minutes to seconds, delivering real-time intelligence.
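To make the contrast with storage-centric designs concrete, the following is a minimal sketch in C of the programming model such an architecture implies: instead of copying data from storage into DRAM with block I/O, a process maps a large byte-addressable memory pool directly into its address space and works on the data in place. It assumes a Linux system exposing such a pool as a device (the path /dev/dax0.0 is a hypothetical example); this is an illustration of the idea, not HPE's actual toolchain.

    /* Sketch: operate on a large byte-addressable memory pool in place,
     * with no read/copy/write cycle through a storage stack.
     * "/dev/dax0.0" is a hypothetical persistent-memory device path. */
    #include <fcntl.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <unistd.h>

    int main(void)
    {
        const size_t pool_size = 1UL << 30;   /* 1 GiB for the demo */
        int fd = open("/dev/dax0.0", O_RDWR); /* hypothetical device */
        if (fd < 0) { perror("open"); return 1; }

        /* Map the pool: the data becomes addressable like ordinary RAM. */
        uint64_t *pool = mmap(NULL, pool_size, PROT_READ | PROT_WRITE,
                              MAP_SHARED, fd, 0);
        if (pool == MAP_FAILED) { perror("mmap"); return 1; }

        /* Work on the data where it lives: sum the first million words. */
        uint64_t sum = 0;
        for (size_t i = 0; i < 1000000; i++)
            sum += pool[i];
        printf("sum = %llu\n", (unsigned long long)sum);

        munmap(pool, pool_size);
        close(fd);
        return 0;
    }

The point of the sketch is what is absent: no file reads, no buffer copies, no serialization. The working set is simply addressed, which is what would allow a 160 TB (or, eventually, exabyte-scale) pool to be shared by many processors over a fast fabric.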
