Leibniz Supercomputing Centre Introduces Plans for its Next-generation SuperMUC Machine
News Dec 15, 2017 | Original story from Gauss Centre for Supercomputing (GCS)
Leadership at the Leibniz Supercomputing Centre (LRZ) of the Bavarian Academy of Sciences and Humanities announced today that they have signed a contract with Intel and Lenovo to build SuperMUC-NG, the next generation of the centre's leading-edge supercomputers.
SuperMUC-NG will be capable of 26.7 petaflops at its theoretical peak, or more than 26 quadrillion calculations per second. This represents a five-fold increase in computing power over that of the current-generation SuperMUC at LRZ. According to LRZ Director Prof. Dr. Dieter Kranzlmüller, SuperMUC-NG will provide researchers from a variety of scientific fields with significantly more capabilities.
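For readers who want to check the arithmetic, the article's headline figure converts directly: one petaflop is 10^15 floating-point operations per second, so 26.7 petaflops is indeed "more than 26 quadrillion calculations per second." A minimal sanity check:

```python
# Sanity-check the article's figures: a 26.7-petaflop theoretical peak
# equals 2.67e16 floating-point operations per second, which exceeds
# the quoted "26 quadrillion" (2.6e16).
PETA = 10 ** 15
QUADRILLION = 10 ** 15

peak_flops = 26.7 * PETA
assert peak_flops > 26 * QUADRILLION  # "more than 26 quadrillion"

print(f"Theoretical peak: {peak_flops:.3e} calculations per second")
```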
"Our new supercomputer, SuperMUC-NG, will provide more compute power for scientists but will also require more expertise," he said. "Researchers will be able to tackle problems that are more complex, and to that end LRZ experts will assist them, providing an interface between the scientific community and computer science. We are well-prepared to support scientists in achieving the next level of supercomputing and as part of the project we will again extend our user support team."
LRZ, one of the three major German computing centres that constitute the Gauss Centre for Supercomputing (GCS), received funding for SuperMUC-NG's acquisition from GCS as well as the German federal government and the state of Bavaria, totalling €96 million over the machine's six-year lifecycle. GCS assumed half the cost with the two governments matching the other half.
Like its predecessor, SuperMUC-NG will use warm-water cooling, helping LRZ to further reduce power consumption and its carbon footprint; waste heat from the machine will also be reused to help produce cold water. Hardware supplier Lenovo focused the cooling concept on sustainability. "As an HPC hardware supplier, we concentrate on innovations related to performance, reliability, and sustainability," said Scott Tease, Executive Director of HPC and AI at the Lenovo Data Center Group. "All three themes come together in the context of our collaboration with LRZ and Intel to build out SuperMUC-NG."
The machine will consist of 6,400 compute nodes based on the Intel Xeon Scalable Processor, connected by Intel's Omni-Path network in a so-called "fat tree" topology. The system will be outfitted with more than 700 terabytes of main memory and 70 petabytes of disk storage. Next-generation interconnects and greater storage capacity mean that LRZ will be better suited than ever before to address the increasingly difficult challenges of data management.
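To give a sense of scale, a textbook k-ary fat tree built from k-port switches supports k³/4 hosts. The sketch below uses that generic formula to find the smallest switch radix that could accommodate 6,400 nodes; it is purely illustrative and does not describe LRZ's actual Omni-Path configuration, whose switch radix and tree layout are not given in the article:

```python
# Illustrative sketch only: generic k-ary fat-tree sizing, not the
# actual SuperMUC-NG Omni-Path configuration (details not published here).

def fat_tree_hosts(k: int) -> int:
    """Hosts supported by a k-ary fat tree of k-port switches: k^3 / 4."""
    return k ** 3 // 4

def min_radix_for(nodes: int) -> int:
    """Smallest even switch radix k whose fat tree fits `nodes` hosts."""
    k = 2
    while fat_tree_hosts(k) < nodes:
        k += 2
    return k

k = min_radix_for(6400)  # SuperMUC-NG's compute-node count
print(f"radix {k} supports {fat_tree_hosts(k)} hosts")  # radix 30 supports 6750 hosts
```

In this idealized topology the fat tree keeps full bisection bandwidth, i.e. links between any two halves of the machine are never oversubscribed, which is why the design is popular for large HPC interconnects.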
"We are happy to contribute an essential part to this important project, and in turn support the work going on at the Leibniz Supercomputing Centre," said Hannes Schwaderer, Head of Enterprise Sales at Intel Deutschland GmbH.
"Processing this data requires immense computational power. Next-generation Intel architecture will play an important role in helping to address data challenges across a broad spectrum of user needs."
The current-generation SuperMUC has made a major impact on many research areas. Its next generation will enable researchers to dramatically expand the scale and scope of their investigations.
For instance, a team led by Technical University of Munich Professor Dr. Michael Bader and Ludwig-Maximilians-Universität Munich researcher Dr. Alice-Agnes Gabriel used the current-generation SuperMUC to create the largest, longest ever multiphysics simulation of an earthquake and its resulting tsunami. The team recreated the 2004 Sumatra-Andaman Earthquake computationally, and was awarded best paper at SC17, the world's premier supercomputing conference, held this year in Denver, Colorado, USA.
Bader indicated that next-generation machines would help his team run many more iterations of its simulations. By testing its models with larger numbers of input parameters, he anticipates being able to achieve a better understanding of how earthquakes and tsunamis develop. Ultimately, this could lead to real-time solutions for mitigating their risks.
"Currently, we've been doing one individual simulation at a time, trying to accurately guess the starting configurations for things like the initial stresses and forces, but all of these are uncertain," he said. "We would like to run our simulation with many different settings to see how slight changes in the fault system or other factors can impact the study, but such large parameter studies would require yet another layer of performance from a machine."
This team, and the geophysics community in general, is just one of many research domains that will benefit from SuperMUC-NG.
This article has been republished from materials provided by Gauss Centre for Supercomputing (GCS). Note: material may have been edited for length and content. For further information, please contact the cited source.