May 16, 2017 (updated 27 Jul 2017, 7:21pm)

The Monster Machine: HPE debuts world’s biggest single-memory computer

The Machine prototype boasts a huge 160TB memory.

By CBR Staff Writer

HPE’s Machine research project, the largest R&D programme in the company’s history, has arrived in a big way – and I cannot emphasise ‘big’ enough. In what HPE claims is the world’s largest single-memory computer, the prototype making its debut boasts a colossal 160TB of memory, powered by ARM processors.

The Machine prototype is geared towards delivering HPE’s vision of Memory-Driven Computing, which the company describes as ‘an architecture custom-built for the Big Data era’. Put simply, Memory-Driven Computing puts memory, not the processor, at the centre of the computing architecture. By eliminating the inefficiencies of how memory, storage and processors interact in traditional systems today, Memory-Driven Computing reduces the time needed to process complex problems from days to hours, hours to minutes, and minutes to seconds.


The Machine prototype’s impressive memory is capable of simultaneously working with the data of approximately 160 million books. It has never been possible to hold and manipulate whole data sets of this size in a single-memory system, which HPE says proves the potential of Memory-Driven Computing.

“The secrets to the next great scientific breakthrough, industry-changing innovation, or life-altering technology hide in plain sight behind the mountains of data we create every day,” said Meg Whitman, CEO of Hewlett Packard Enterprise.

“To realize this promise, we can’t rely on the technologies of the past, we need a computer built for the Big Data era.”


Based on the current prototype, HPE expects the architecture could easily scale to an exabyte-scale single-memory system and, beyond that, to a nearly limitless pool of 4,096 yottabytes of memory. To put that into context, with that much memory you could simultaneously work with every digital health record of every person on earth; every piece of data from Facebook; every trip of Google’s autonomous vehicles; and every data set from space exploration.
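These figures can be sanity-checked with some back-of-envelope arithmetic. The sketch below is illustrative only: the ~1MB-per-book figure is an assumption (a rough average for a digitized text-only book), not something HPE has stated.

```python
# Back-of-envelope check of the scale claims in this article.
# ASSUMPTION: an average digitized book is roughly 1 MB of text.

TB = 10**12  # terabyte, in bytes (decimal units)
YB = 10**24  # yottabyte, in bytes

prototype_memory = 160 * TB      # The Machine prototype's memory
avg_book_bytes = 1 * 10**6       # ~1 MB per book (assumption)

# How many books fit in 160TB at ~1MB each?
books = prototype_memory // avg_book_bytes
print(f"Books held at once: {books:,}")
# 160,000,000 -- matching the "approximately 160 million books" claim.

# The projected ceiling of 4,096 yottabytes (note: 4,096 = 2**12)
# is 25.6 trillion times the prototype's memory.
ceiling = 4096 * YB
print(f"Ceiling vs. prototype: {ceiling // prototype_memory:,}x")
```

The arithmetic bears out the article’s framing: the 160 million figure follows directly from a 1MB-per-book estimate, and the yottabyte-scale ceiling is many orders of magnitude beyond the prototype.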


“We believe Memory-Driven Computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society,” said Mark Potter, CTO of HPE and Director of Hewlett Packard Labs.

“The architecture we have unveiled can be applied to every computing category, from intelligent edge devices to supercomputers.”
