Hewlett Packard Enterprise (HPE) has presented the second prototype of its single-memory big-data computer "The Machine". The new development stage has 160 TB of memory and can, according to HPE, work simultaneously on five times the data held in all the books of the Library of Congress – around 160 million books. Until now it has not been possible to hold and manipulate data sets of this size in a system with a uniform memory.
The second prototype also offers 40 physical compute nodes that are connected to each other via a high-performance fabric protocol. The operating system is an optimized Linux running on ThunderX2, Cavium's second-generation ARMv8 SoC. "The Machine" can also draw on photonic communication modules, including the new X1 photonics module, as well as software programming tools that can fully exploit the advantages of the huge non-volatile memory.
Chairwoman Meg Whitman sees great advantages in the enormous computing power: "The next great scientific breakthrough, groundbreaking innovations and technologies are hiding in plain sight behind the mountains of data we generate every day. To live up to this promise, we cannot rely on the technologies of the past. We need a computer that was built specifically for the age of big data."
Hewlett Packard Enterprise (HPE) had already exhibited the first prototype of its computing architecture "The Machine", which has been in preparation for several years, earlier this year. Elements of the new architecture are to be used in HPE products as early as 2018 and 2019. The manufacturer wants to replace the computer architectures that have been in use for about 60 years with the world's first memory-centric computer architecture.
According to HPE, these are no longer capable of satisfactorily solving the future tasks that cloud computing, mobile networks and machine-to-machine computing, among others, will impose on them.
In "The Machine", by contrast, processors specialized for particular workloads communicate over optical connections with a universal memory. The separation between main memory and mass storage is thereby abolished. HPE uses memristors for the new memory.
Byte-addressable non-volatile memory, however, is planned by HPE for 2018 or 2019. It is meant to combine the performance advantages of DRAM with the capacity and durability advantages of traditional storage. A step along this path is HPE Persistent Memory. It was introduced in the spring and is available as an option for the HPE ProLiant DL360 and DL380 Gen9 servers. It is designed especially to support database and analytics workloads and should help eliminate the bottlenecks of traditional storage environments.
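The programming model behind byte-addressable persistent memory can be approximated today with memory-mapped files: a persistent region appears in the address space and is updated in place, byte by byte, without explicit read/write calls. The following is a minimal sketch of that idea; the file name and region size are illustrative, and an ordinary file stands in for the non-volatile memory region.

```python
import mmap
import os

PATH = "nvm_region.bin"  # illustrative stand-in for a persistent memory region
SIZE = 4096              # one page, for demonstration only

# Create and size the backing "persistent region" once.
with open(PATH, "wb") as f:
    f.truncate(SIZE)

# Map the region and update it as if it were ordinary RAM.
with open(PATH, "r+b") as f:
    region = mmap.mmap(f.fileno(), SIZE)
    region[0:5] = b"hello"   # byte-addressable store, no write() call
    region.flush()           # make the update durable
    region.close()

# After a "restart", the data is still there: read it back.
with open(PATH, "rb") as f:
    recovered = f.read(5)
os.remove(PATH)
print(recovered)
```

The point of the sketch is the absence of a storage stack in the data path: the store at `region[0:5]` is a plain memory operation, and durability is a separate, explicit step – the same split that memory-driven architectures aim to make the default.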
Optical technologies developed for "The Machine" are to be brought to market with the HPE Synergy systems announced in 2017. From 2018, the photonics technology from "The Machine" will be integrated into further product lines, including parts of HPE's storage portfolio. The company then also plans to bring fabric-attached storage products to market, based on the high-performance interconnect protocol developed by the recently founded Gen-Z Consortium. In addition to HPE, its members include, for example, IBM, Dell EMC, Huawei, Cray, Micron and SK Hynix.
So that software that can benefit from memory-driven computing is available when the new computer systems reach the market, HPE has been working since 2016 with Hortonworks, which offers an in-memory engine with Apache Spark. Since June 2016, code has also been available on GitHub to introduce programmers to the memory-driven architecture. In 2017, this code is to be integrated into existing systems and will then form the basis for future analytics applications.