Image: Lenovo® Neptune™ liquid-cooled compute trays

The University of Birmingham has today announced that it has been awarded £4m for a major project to develop a computing system aimed at helping researchers to speed up the scientific discovery process and provide new insights into important research questions.

The project is a collaboration between the University of Birmingham, The Rosalind Franklin Institute, The Alan Turing Institute and Diamond Light Source, the UK’s national synchrotron. It is funded by the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation, and an allocation of the service will be available to EPSRC-funded researchers.

Called 'Baskerville' and named after John Baskerville, the Enlightenment-era Birmingham industrialist, the Tier 2 accelerated compute facility will provide a state-of-the-art platform for graphics processing unit (GPU)-accelerated computing. It will help researchers to accelerate machine learning algorithms and simulation technology, with wide-ranging applications in computer vision, language processing, molecular modelling, and materials science.

Professor Iain Styles, of the University of Birmingham’s School of Computer Science and Director of the University’s Institute for Interdisciplinary Data Science and Artificial Intelligence, led the bid. He said: “Access to accelerated computing is now a major bottleneck in computational research. This facility will serve to accelerate progress in areas such as materials design, drug development, and machine learning research and all of its applications. We are delighted to be able to provide this new capacity to researchers at the University; at our partners – Diamond Light Source, the Rosalind Franklin Institute, and The Alan Turing Institute; and to the wider EPSRC research community.”

The facility will also provide a welcome boost to researchers working with The Alan Turing Institute, the UK’s national institute for data science and artificial intelligence.

Tomas Lazauskas, Senior HPC Research Software Engineer and Research Computing Team Lead at The Alan Turing Institute, commented: “High Performance Computing services are a vital prerequisite for our community’s cutting-edge research, as they provide computational resources and technological capabilities, in addition to a diverse range of expertise. Baskerville is a tremendous addition to the battery of computing facilities available to the Turing with a unique architecture of compute nodes which will enable researchers to target even more complex and data intensive workloads than before.”

Dr Mark Basham, Artificial Intelligence and Informatics Theme Lead at The Rosalind Franklin Institute, said: “The Rosalind Franklin Institute is dedicated to transforming life science through interdisciplinary research and technology development, specifically in the area of imaging. Biological imaging routinely collects huge quantities of data, and the extraction of information from these images requires significant computational resources. The cutting-edge compute cluster at Birmingham, with state-of-the-art GPU acceleration and high-speed memory architectures, is exactly the system that we need to address these computational problems for our grand challenge experiments.”

Dr Andrew Richards, Head of Scientific Computing and Data Management at Diamond Light Source, said: “The provision of a significant GPU-based resource at the University of Birmingham, in conjunction with Diamond Light Source and the Rosalind Franklin Institute, provides an opportunity for Diamond to accelerate its use of external resources for computationally intensive data analysis. We are pleased to be involved in this activity, which enables Diamond to further extend its ability to support users requiring access to GPU-accelerated technology for their work.

“We will be working with Birmingham to automate data transfer and develop workflow management systems to enable Diamond users to access this resource as easily as possible. For new beamlines like DIAD (Dual Imaging and Diffraction), which is also linked to the University of Birmingham, this opens opportunities for accessing significant resources to process large volumes of imaging data in new and innovative ways.”

The new resource will be built from 46 Lenovo® Neptune™ liquid-cooled servers, each featuring twin next-generation Intel Xeon CPUs, 512GB of system RAM and 980GB of local NVMe storage, and built to support four of NVIDIA's new A100 Tensor Core GPUs attached to each system via high-performance PCIe Gen 4 connections. In total, 184 GPUs across the 46 compute nodes will provide over 2 petaflops of computing resource. To support large-dataset processing, Lenovo will also provide ~5PB (usable) of spinning disk and ~0.5PB (usable) of flash storage on Lenovo DSS-G storage systems running IBM® Spectrum Scale™ RAID. The storage and compute systems will all be interconnected using NVIDIA® Networking ConnectX® HDR InfiniBand adapters and Mellanox Quantum™ switches, providing 200Gbps links to support high-throughput and scale-out workloads.
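As a rough sanity check on that headline figure, the short sketch below works through the arithmetic. The node and GPU counts come from the announcement; the per-GPU peak throughput numbers are NVIDIA's published A100 specifications, not figures from this article.

```python
# Sanity check of the aggregate peak compute figure (a sketch only).
# Per-GPU peaks are NVIDIA's published A100 specs, not from this article.
NODES = 46
GPUS_PER_NODE = 4
FP64_TFLOPS = 9.7          # A100 peak FP64 throughput per GPU
FP64_TENSOR_TFLOPS = 19.5  # A100 peak FP64 Tensor Core throughput per GPU

gpus = NODES * GPUS_PER_NODE  # 184 GPUs in total
print(f"Total GPUs:       {gpus}")
print(f"FP64 peak:        ~{gpus * FP64_TFLOPS / 1000:.1f} PFLOPS")         # ~1.8
print(f"FP64 Tensor Core: ~{gpus * FP64_TENSOR_TFLOPS / 1000:.1f} PFLOPS")  # ~3.6
```

On those assumptions, the "over 2 petaflops" figure is consistent with counting FP64 Tensor Core throughput (roughly 3.6 PFLOPS aggregate), while plain FP64 would come in at about 1.8 PFLOPS.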

The system will be housed in the University of Birmingham’s dedicated research data centre, which opened in 2018 following a £6m investment by the University and is equipped with highly efficient liquid cooling infrastructure.

Simon Thompson, Research Computing Infrastructure Architect at the University of Birmingham, said: “The system we have designed provides an optimised solution based on scale-out hardware and will deliver some of the latest-generation accelerated compute to our researchers.

“Through our investment in liquid cooling, we are well placed to support accelerated computing both now and in the future, and we look forward to the challenges of delivering a Tier 2 facility to the UK.”

“We have worked in partnership with the University of Birmingham for a number of years and were proud to have it as a founding member of our Project Everyscale, a global consortium of leading HPC visionaries,” said Scott Tease, General Manager of HPC and AI at Lenovo. “As one of our longest-standing HPC partners, we’re delighted to be supplying this system under our early access programme. Birmingham is one of the few UK sites able to take such a system, thanks to its strategic investment in liquid cooling infrastructure.

“The fact the University is able to deliver over two petaflops of compute power in just two racks shows how important investment in infrastructure is to delivering sustainable supercomputing as we enter the exascale era,” said Tease.

The system will be supplied by Lenovo® via the University of Birmingham’s research computing framework partner OCF Limited, which will support the logistics of delivering the system. Integration and operation of the facility will be carried out by the University’s Advanced Research Computing team as part of the Birmingham Environment for Academic Research (BEAR). Dr Andrew Edmondson (Research Software Group Lead) and Simon Thompson (Research Computing Infrastructure Architect) will lead the teams at Birmingham delivering and supporting the facility.

  • For media enquiries please contact Beck Lockwood, Press Office, University of Birmingham, tel: +44 (0)781 3343348.
  • The University of Birmingham is ranked amongst the world’s top 100 institutions. Its work brings people from across the world to Birmingham, including researchers, teachers and more than 6,500 international students from over 150 countries.
  • NVIDIA, NVLink, ConnectX and Mellanox Quantum are trademarks and/or registered trademarks of NVIDIA Corporation in the United States, other countries, or both.
  • Lenovo, Neptune and Exascale to Everyscale are trademarks and/or registered trademarks of Lenovo in the United States, other countries, or both.
  • IBM and IBM Spectrum Scale are trademarks and/or registered trademarks of International Business Machines Corporation (“IBM”) in the United States, other countries, or both.
  • Other company and product names may be trademarks of the respective companies with which they are associated.
  • More information about the EPSRC Research Infrastructure theme.