Newcastle's New Supercomputer is a "Rocket" for Research
Researchers at Newcastle University are benefiting from a new HPC machine, called Rocket. The University has purchased a centralized HPC system to support researchers across the whole institution and to replace existing departmental clusters as they reach end of life. The new system was designed, integrated and configured by OCF, a high-performance compute, storage and data analytics integrator. The £2 million investment is funded by the University, reflecting its commitment to providing world-class research facilities for its existing academics and to attracting new researchers.
With Rocket, the University aims to establish an ‘HPC culture’, encouraging cross-departmental research and collaboration and helping it remain a world-class research institution. Over 200 users are already benefitting from Rocket, including academics, researchers, staff and students from more than 15 University departments, spanning medicine, neuroscience, chemistry, biology, physics, mathematics, engineering, computing, business and geography.
“Workloads are driving an ever-growing set of data-intensive challenges that can only be met with accelerated infrastructure,” said Werner Hofer, Dean of Research & Innovation at Newcastle University. “Rocket provides the significant memory and fast processing we need for bulky, complex numerical computation. My post-doctoral researcher was able to process half a million CPU hours’ worth of calculations, which would not have been possible with our previous processing power.”
Computations that previously took researchers months, or even years, on a single desktop computer can now be run on Rocket as high-volume simulations, generating results in a significantly reduced timeframe and saving valuable time and budget.
Rocket in action
- Flood simulations using a change quantification algorithm, assessing the extent of change, now run in 11 hours on Rocket rather than 18 days on a desktop computer.
- Brain geometry analysis that used to take the equivalent of 270 person-years can now be performed in just a few hours.
- Drug testing can begin with preliminary ‘in silico’ research rather than ‘in vitro’ experiments, significantly reducing the need for animal research: computational chemistry is used to model and predict which compounds will be effective drugs.
- Using deep learning for the early detection of respiratory disease in pigs, 17,250,000 particle evaluations can be carried out in just two days. The approach outperforms the other methods tested and drastically reduces the time needed for Particle Swarm Optimisation (PSO).
- Identifying genetic mutations arising in cancerous cells: with Rocket, the run time for 30 genomes is now 12 hours, compared with 28 days on a single core. The pipeline identified 100% of cancer-causing mutations, creating the potential for faster, more accurate diagnosis of leukaemia types, and in turn more targeted treatment and better outcomes for patients.
- Identifying rare genetic diseases: Rocket can align whole-exome sequences and call variants (across millions of sequences per genome), making it possible to search almost the entire protein-coding region of a patient’s genome for nuclear genetic defects in a single experiment, in approximately one day.
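The pig respiratory disease project above relies on Particle Swarm Optimisation. The article does not describe the researchers' implementation, but the core algorithm, whose cost is dominated by the per-particle fitness evaluations that Rocket parallelises, can be sketched as follows (all function names and parameter values here are illustrative assumptions, not the researchers' code):

```python
import random

def pso(f, dim, n_particles=30, iters=100, bounds=(-5.0, 5.0)):
    """Minimal Particle Swarm Optimisation: minimise f over a box."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    # Each particle remembers its personal best; the swarm tracks a global best.
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest_pos, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive and social coefficients (typical values)
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: pulled toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest_pos[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])  # one "particle evaluation" in the article's terms
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest_pos, gbest_val = pos[i][:], val
    return gbest_pos, gbest_val

# Example usage: minimise the sphere function, whose optimum is at the origin.
random.seed(0)  # fixed seed for reproducibility
best, value = pso(lambda x: sum(v * v for v in x), dim=3)
```

The particle evaluations within each iteration are independent of one another, which is why this kind of search maps naturally onto an HPC cluster: the 17,250,000 evaluations can be spread across many cores at once.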
Overall, this is a significant improvement on what was previously achievable with departmental HPC clusters, meaning researchers can now run much larger problems, getting more immediate answers to a range of “what if” scenarios.
The new 5,000-core system comprises Huawei HPC servers housed in E9000 chassis, with 110 CH121v3 and CH122 nodes, including extra-large, large and medium memory nodes, connected by a Mellanox EDR interconnect. This is supported by DDN EXAScaler® ES7KX® storage, which provides 500 terabytes (TB) of high-speed storage, and by the University’s Research Data Warehouse, which provides two petabytes (PB) of storage for data at rest.
“With OCF as our partner, we have access to the skills and knowledge we need to create our HPC system,” said Martin Edney, Faculty IT Manager at Newcastle University. “From inception, to design, consultancy and implementation, OCF has the in-depth experience to ensure Rocket meets the needs of every University department. The ongoing support service means that OCF is always on hand for expert guidance and support.”
“With OCF’s experience in managing large-scale data challenges, we knew we could put in place the right technology and services to support Newcastle University’s research work, both now and in the future,” said Julian Fielden, Managing Director of OCF. “Rocket will help the University to remain a leading academic establishment globally, and continue to attract world-class researchers, as well as much-needed grants and funding to continue with their research excellence.”
Edney adds: “Computational research is so important to us as an institution. Rocket is not just about buying a big computer; it’s about creating an HPC culture and allowing everyone access, to achieve a better research impact and to generate new knowledge and insight.”
“We have a data explosion in every single aspect of science and society and we need the appropriate computational resources to be able to process this data,” said Dr Jaume Bacardit, Reader in Machine Learning at Newcastle University.
“People have to spend a lot of time writing proposals to get access to the big national or regional supercomputers so it’s a lot of work for us as academics. It’s much easier for us if we can just have a local machine that we can use,” said Dr Tamara Rogers, Reader in Computational Astrophysics at Newcastle University.
This article has been republished from materials provided by OCF. Note: material may have been edited for length and content. For further information, please contact the cited source.