“For the huge masses of data produced by applications like autonomous driving, we urgently need a new, memory-oriented computer architecture,” HPE management explained again and again to the audience strolling through the halls. “After all, a single vehicle produces four TB of data per day,” elaborated Heiko Meyer, Managing Director of HPE Enterprise Germany, before the press. In autonomous driving, the data no longer merely supports human interaction with the systems, as before; the results of its lightning-fast evaluation also drive rule-based actions that machine learning continually adapts to the current situation. For that, traditional architectures are far too slow, because it is cumbersome to fetch the data from the hard disk into working memory.
So what does a Nodeboard look like? At the far left of the photo you can see a gateway device that logically links multiple Nodeboards to a dual redundant backbone. Physically, the redundant backbone connection consists of two strands of fast optical fibre. Data transport uses silicon photonics: circuits integrated largely directly on the board convert the digital signals into light pulses. That saves time and makes the connections fast. Each connection consists of a fast outbound channel and an equally fast return channel of 600 GBit/s each, so every Nodeboard has a redundant total of 1.2 TBit/s available for communication with the rest of the system. The design allows the available bandwidth to be scaled by adding further connections. At some point all communication within a Nodeboard is to be carried by light pulses as well, but until then the SiPho technology must be refined further.
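As a back-of-the-envelope check of those figures (the 600 GBit/s per direction and the 4 TB per vehicle per day are the article's numbers; everything else is illustrative arithmetic):

```python
# Figures quoted in the article: 600 Gbit/s per direction, two
# directions per connection, one day of data from a single car = 4 TB.
GBIT = 10**9

per_direction = 600 * GBIT        # bits per second, one direction
aggregate = 2 * per_direction     # outbound + return = 1.2 Tbit/s

day_of_car_data = 4 * 8 * 10**12  # 4 terabytes, expressed in bits

# Time to move one vehicle-day of data over a single direction:
seconds = day_of_car_data / per_direction
print(aggregate, round(seconds, 1))  # prints: 1200000000000 53.3
```

So a Nodeboard link could, in principle, swallow a full day of one vehicle's sensor data in under a minute.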
Gen-Z instead of PCIe
Further to the right on the board sits the core of the architecture, the processor, and to its right the diskless memory banks, including programmable devices that route traffic to the correct memory. Each processor gets exactly the memory required for the task it is currently working on. Memory is no longer accessed via familiar technologies such as PCI or PCIe, but via the new interconnect standard Gen-Z.
Behind Gen-Z, which multiplies the bandwidth for direct memory access compared with PCIe, stands a remarkably broad panel of manufacturers. Actually, only one name is missing from the Gen-Z Consortium: Intel. No wonder: that manufacturer has so far failed to adapt its processor architectures, with novel memory-access mechanisms, to the dawning data age. It is not at all wrong to regard Gen-Z as a frontal attack on Intel's technology in this field. It will therefore be worth watching whether the manufacturer joins this panel in the near future or pulls something of its own out of the hat.
A complete system of “The Machine” contains more than one Nodeboard, and active data can reside anywhere in the system. Each task thus gets exactly as much memory as it needs, wherever memory happens to be free. For some time, however, the memory on a single board should suffice: up to a petabyte fits on it. Each system has a dual backbone to which the Nodeboards are connected side by side; currently up to ten are planned.
“The Machine”: interest from users
The interest in “The Machine”, the HPE management stresses unanimously, has been overwhelming, because the technology makes hitherto very time-consuming processes orders of magnitude faster. Andreas Hausmann, HPE Chief Technologist Networking in Germany, reported on a cooperation with the German Center for Neurodegenerative Diseases (DZNE) that was completed in time for CeBIT. “So far it took a week to 14 days until an image analysis was completed and the next step could be carried out. With ‘The Machine’ that shrinks to hours or minutes.” Countless images held in the system's memory can be compared more or less simultaneously with a new image to discover matching patterns and derive the appropriate diagnosis.
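The in-memory comparison workflow described above can be sketched roughly as follows. This is a toy illustration, not DZNE's actual pipeline: the “images” are random byte strings and the similarity measure is a simple byte-difference count, chosen only to show the pattern of holding every reference in memory and scanning them all against a new sample in one pass, with no disk round-trips.

```python
import random

random.seed(0)

# Stand-ins for reference images that would all reside in The Machine's
# large memory pool (hypothetical data, 64 bytes each).
references = [bytes(random.randrange(256) for _ in range(64))
              for _ in range(100)]

# Pretend the newly acquired scan matches reference number 42 exactly.
new_scan = references[42]

def distance(a, b):
    """Count differing bytes; a crude stand-in for a real image metric."""
    return sum(x != y for x, y in zip(a, b))

# With everything memory-resident, comparing against all references is
# a single pass over memory rather than a series of disk loads.
best = min(range(len(references)),
           key=lambda i: distance(references[i], new_scan))
```

Here `best` ends up at index 42, the exact match; on the real system the gain comes from keeping terabytes of such references byte-addressable at once.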
One concept, however, has probably not survived the first stages of its realization unchanged, and the most important change concerns the software for “The Machine”. “We have changed our opinion somewhat on this point,” said Andrew Wheeler, Vice President and Deputy Director of Hewlett Packard Labs. Originally a completely new operating system was to be written; now, new software modules are being developed in the open Linux community instead. “Research on a new operating system is under way at several universities, but the open software is currently more important,” says Wheeler. The most significant difference from conventional Linux systems is that the complete input/output logic is dropped without replacement, because “The Machine” contains no hard drives. According to HPE, programs can often be adapted to “The Machine” simply by removing the input/output logic from them, because it serves no function there: direct memory access suffices.
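What “removing the I/O logic” might mean in practice can be sketched with a memory-mapped file standing in for The Machine's fabric-attached persistent memory (an assumption for illustration; on the real hardware the pool would simply appear in the process's address space). Instead of read()/write() calls, the program uses ordinary loads and stores:

```python
import mmap
import os
import tempfile

# A file-backed mapping stands in for the persistent memory pool
# (illustrative only; path and size are arbitrary).
path = os.path.join(tempfile.mkdtemp(), "pool.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)          # carve out a 4 KiB "pool"

with open(path, "r+b") as f:
    pool = mmap.mmap(f.fileno(), 0)  # map the pool into the address space
    pool[128:133] = b"hello"         # a plain store, no write() call
    data = bytes(pool[128:133])      # a plain load, no read() call
    pool.close()
```

The point of the sketch: once the data lives in a byte-addressable pool, the file-I/O layer between program and storage has nothing left to do.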
The second point concerns the memory technology with which the Nodeboards will be equipped. The memristors originally planned for this are non-volatile memories that need far less power than DRAM and that HPE has long been researching. Western Digital and SanDisk are working on manufacturing them in volume on the basis of HPE's developments. Wheeler: “Memory manufacturers often operate with tiny margins, and the yield is currently still too small.” It could also turn out that Intel and Micron reach the necessary production maturity with their likewise non-volatile 3D XPoint (pronounced: 3D Crosspoint) technology faster than Western Digital. In that case, HPE says it might well be among the very first to use this type of memory in The Machine; the main thing, as the motto goes, is that it is non-volatile.
Well, in any case: over the next one or two years the manufacturer wants to begin integrating the technologies underlying The Machine step by step into its products: Silicon Photonics, the memory-centric architecture with the Gen-Z interconnect, and non-volatile memory as a replacement for the hard disk. The first products based entirely on The Machine's technology could, according to HPE, result from its combination with the big-data technologies acquired with SGI.
Is “The Machine” a competitor to products like SAP HANA or IBM Watson? HPE denies it: “Watson and HANA could run wonderfully on ‘The Machine’,” said Wheeler. Whether IBM would see it quite the same way is questionable. In any case, this architecture harks back to the early days of digital computing: at the very beginning of computer development, memory was also used that did not lose its data when the machine was switched off normally; only its capacity was much smaller.