Big Data

HPE to Revolutionize Big Data, Enter “The Machine”

Just how much memory is Hewlett Packard Enterprise’s new computer, called “The Machine,” capable of storing? We’re talking big data, like really big.

A Limitless Amount of Big Data

In development since 2014, the prototype the company presented earlier this week is designed specifically for the big data era. It uses a new kind of memory to store and instantly analyze mind-boggling, potentially even limitless, amounts of data. The current prototype contains 160 terabytes (TB) of memory, enough to store and work with every book in the Library of Congress five times over.
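For readers who want to check the math, here is a quick back-of-envelope calculation in Python using only the article’s figures; the implied size of the Library’s book collection is an inference, not an official measurement:

    # Back-of-envelope check of HPE's comparison (article figures only;
    # not an official measurement of the Library of Congress collection).
    PROTOTYPE_TB = 160  # memory in the current prototype
    COPIES = 5          # "five times over"
    print(f"~{PROTOTYPE_TB / COPIES:.0f} TB per copy of the collection")  # ~32 TB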

The World Is Literally on Its Shoulders

But that’s just scratching the byte surface of The Machine. HPE expects to eventually build a machine with up to 4,096 yottabytes of memory. That would make it big enough to hold all the data currently stored in the world … 250,000 times. The Machine could crunch through “every digital health record of every person on earth; every piece of data from Facebook; every trip of Google’s autonomous vehicles; and every data set from space exploration all at the same time,” HPE CEO Meg Whitman wrote in a blog post.
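Those two claims are consistent with each other: dividing the 4,096-yottabyte target by the quoted factor of 250,000 implies roughly 16 zettabytes of data stored worldwide, which is in the same ballpark as industry estimates of the global datasphere at the time. A quick sanity check, again using only the article’s figures:

    # Sanity check: dividing the 4,096-yottabyte target by the quoted
    # factor of 250,000 recovers the implied size of the world's data.
    YB = 10**24  # bytes in a yottabyte
    ZB = 10**21  # bytes in a zettabyte
    implied_world_data = 4_096 * YB / 250_000
    print(f"implied world data: ~{implied_world_data / ZB:.1f} ZB")  # ~16.4 ZB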

Whitman calls The Machine HPE’s moonshot device and expects that the ability to analyze that much data will open up new frontiers of intellectual discovery. “No computer on Earth can manipulate that much data in a single place at once. And this is just the prototype,” she wrote.

Here are the technical specs for this prototype:

  • 160 TB of shared memory across 40 physical nodes, interconnected using a high-performance fabric protocol
  • An optimized Linux-based operating system (OS) running on ThunderX2, Cavium’s flagship second-generation, dual-socket-capable, ARMv8-A workload-optimized system on a chip (SoC)
  • Photonics/Optical communication links, including the new X1 photonics module, are online and operational
  • Software programming tools designed to take advantage of abundant persistent memory (see the sketch below)
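HPE has not published The Machine’s programming tools as a general API, but the core idea behind persistent-memory programming can be sketched generically: data structures live directly in byte-addressable memory that survives restarts, with no separate save-to-disk step. Here is a minimal Python illustration that uses a memory-mapped file as a stand-in for persistent memory; the file name is hypothetical, and this is not The Machine’s actual toolchain:

    # Minimal sketch of the persistent-memory programming model, using a
    # memory-mapped file as a stand-in for byte-addressable persistent
    # memory. This is NOT The Machine's actual toolchain.
    import mmap
    import os
    import struct

    PMEM_FILE = "counter.pmem"  # hypothetical backing file
    SIZE = 8                    # room for one 64-bit counter

    fd = os.open(PMEM_FILE, os.O_RDWR | os.O_CREAT)
    os.ftruncate(fd, SIZE)
    pmem = mmap.mmap(fd, SIZE)

    # Read, update, and persist the counter directly in mapped memory:
    # no serialization layer sits between the data structure and storage.
    (count,) = struct.unpack_from("<Q", pmem, 0)
    struct.pack_into("<Q", pmem, 0, count + 1)
    pmem.flush()                # analogous to flushing caches to persistent media

    print(f"counter has survived {count + 1} run(s)")
    pmem.close()
    os.close(fd)

Run the script repeatedly and the counter keeps incrementing across runs, which is the property abundant persistent memory gives an application by default.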

“We believe Memory-Driven Computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society,” said Mark Potter, CTO of HPE and Director of Hewlett Packard Labs. “The architecture we have unveiled can be applied to every computing category, from intelligent edge devices to supercomputers.”

Memory-Driven Computing puts memory, not the processor, at the center of the computing architecture. By eliminating the inefficiencies of how memory, storage, and processors interact in traditional systems, Memory-Driven Computing cuts the time needed to process complex problems from days to hours, hours to minutes, and minutes to seconds, delivering real-time intelligence, the company says.
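The claimed speedup comes from where the working set lives, not from faster processors. The toy Python contrast below, scaled down to laptop size, illustrates the gap Memory-Driven Computing targets; it is not HPE code, the file name and sizes are arbitrary, and the absolute timings are machine-dependent:

    # Toy contrast between storage-bound and memory-resident processing
    # (not HPE code; names and sizes are arbitrary). Timings vary by
    # machine, and the gap grows dramatically once the data no longer
    # fits in the operating system's cache.
    import os
    import time

    PATH = "dataset.bin"                 # hypothetical scratch file
    data = os.urandom(50 * 1024 * 1024)  # 50 MB stand-in dataset
    with open(PATH, "wb") as f:
        f.write(data)

    def storage_pass() -> int:
        with open(PATH, "rb") as f:      # pay the storage round-trip each time
            return f.read().count(0)

    def memory_pass() -> int:
        return data.count(0)             # working set already resident in memory

    for label, fn in [("re-read from storage", storage_pass),
                      ("memory-resident", memory_pass)]:
        start = time.perf_counter()
        fn()
        print(f"{label}: {time.perf_counter() - start:.4f} s")

    os.remove(PATH)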