HPC-powered Solutions Offer a Widescreen Approach to Scientific Visualization

NIU major Nolan Cooper poses in front of a display wall at the ddiLab. Credit: NIU via YouTube

What does it take for expert researchers to understand something as complex as an exploding star, or the functioning of the human brain? For starters, they need powerful codes to model and simulate their problems, and computers with hundreds of thousands of nodes to run their simulations. But mostly, they need lots and lots of data.

Ferreting discoveries out of the massive datasets generated in such domains is a challenge that demands the analysis and visualization capabilities of high-performance computing, or HPC. HPC systems aggregate computing power so that very large problems can be processed in a useful amount of time, and science has applied them effectively for decades. More recently, smaller commodity systems, such as computing clusters running powerful software tools, have brought large-scale data analysis capabilities to universities and industry alike.
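The benefit of aggregating computing power is commonly estimated with Amdahl's law, which bounds the speedup by the fraction of the work that actually parallelizes. This short sketch (the numbers are illustrative, not measurements from any system mentioned here) shows why adding nodes pays off most when the workload is highly parallel:

```python
def amdahl_speedup(parallel_fraction, n_nodes):
    """Amdahl's-law estimate of speedup when `parallel_fraction` of the
    work parallelizes perfectly across n_nodes and the rest stays serial."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_nodes)

# A 95%-parallel simulation gains far more from extra nodes
# than a 50%-parallel one does.
for nodes in (1, 8, 64):
    print(nodes, amdahl_speedup(0.95, nodes), amdahl_speedup(0.50, nodes))
```

Even with 64 nodes, the 50%-parallel job cannot quite double its speed, while the 95%-parallel job approaches a 16x speedup, which is why HPC workloads are engineered to keep the serial fraction small.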

Northern Illinois University’s (NIU) new Hopcroft system offers HPC capabilities that support data analysis, simulation, and visualization workloads. The system resides in NIU’s Data, Devices and Interaction Laboratory (ddiLab), a workspace for students and faculty to perform collaborative computer science research. Hopcroft-generated data sets can be shared with the research community to advance project-specific goals. The data can also be converted to visualizations for further exploration on large-scale displays and interactive environments, such as the sizeable tiled visualization wall also housed in the ddiLab.

“In every field and every discipline, the amount of ‘minable’ data sets available to researchers will only continue to grow,” said Michael Papka, NIU professor of computer science and co-director of the ddiLab. “Our goal with Hopcroft is to provide a resource for HPC data analysis tasks and to help train students and future HPC users.”

ddiLab

NIU Associate Professor and ddiLab co-director Joe Insley will use Hopcroft to process large data sets in preparation for visualization on the ddiLab wall, a 16-foot by eight-foot array of high-resolution monitors used for viewing and interacting with visualizations. The wall is surrounded by various other technologies, including cameras, Arduino-based systems, and Internet of Things (IoT) devices to capture users’ movements and gestures and to monitor the system controllers for commands. The ddiLab team also incorporated virtual reality and augmented reality technologies to enhance the visualization experience.


NIU Associate Professor Joe Insley works with an NIU graduate student on intuitive methods for interacting with data visualizations on the ddiLab display wall. Image courtesy of Northern Illinois University.


The environment itself is a research project and undergoes modifications based on student and faculty research, such as replacing the traditional keyboard-and-mouse interface with a solution informed by years of human-computer interface research.

Insley and Papka, both visualization specialists, began experimenting with various display technology solutions in graduate school to overcome the resolution limitations of the projectors available at the time. As the display technology continued to improve, Papka and Insley explored various configurations and added interactive technologies needed to support collaboration—work that continues at the ddiLab to this day.

“We explore intuitive approaches that work for both scientists and non-scientists,” said Insley. “Our goal is to provide NIU’s researchers and visual artists with the maximum flexibility for a variety of use cases.”

Several technologies used in the ddiLab have an HPC component, which relates to Papka’s and Insley’s work at Argonne National Laboratory. Insley works as a team lead and visualization specialist, and Papka serves in several leadership roles, including as the director of the Argonne Leadership Computing Facility, a U.S. Department of Energy Office of Science user facility dedicated to open scientific discovery.

“While modern HPC systems offer ample performance, our ability to obtain, curate, and prepare data for visualization is an equally important requirement to do our work successfully,” said Insley. “Hopcroft supports both these things.”

Data and technology behind the scenes

To meet the demands for real-time simulations and visualizations, analysis, and data management on the same system, NIU turned to Atipa Technologies to implement Intel Select Solutions for Simulation & Visualization – a pre-validated and tested solution that combines second-generation Intel Xeon Scalable processors and other Intel technologies – and to install Hopcroft to provide the HPC back-end power.

“The combination of the Atipa Polaris High Performance Computing and Visualization (HPCV) platform and Intel Select Solutions offers us a more turnkey HPC system for rapid deployment,” said Papka. “The Intel Rendering Framework and the ray-tracing algorithms involved in Hopcroft’s functionality also eliminate the need for supplemental graphics-accelerating processors.”


Professor Michael E. Papka working with NIU undergraduate student Nolan Cooper on the tracking of 3D printed objects as tangible virtual display surfaces within an AR environment. Image courtesy of Northern Illinois University.


In addition to the Intel oneAPI Rendering Framework (which includes Embree and OSPRay), open-source applications like ParaView enable Hopcroft to help researchers tackle challenging real-time simulations with greater precision than possible before.

“Moreover, the Intel Rendering Framework allows us to easily decompose problems across many Hopcroft nodes using libraries such as the Message Passing Interface (MPI),” said Papka. “By combining HPC nodes to maximize the available memory pool, we can accommodate data sets in the multi-terabyte range.”
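The decomposition Papka describes typically starts with block partitioning: each MPI rank owns a contiguous slice of the data, so the aggregate memory of all nodes holds a data set no single node could. This is a minimal pure-Python sketch of that bookkeeping; the function name and the cell counts are illustrative assumptions, not code from Hopcroft:

```python
def block_range(n_items, n_ranks, rank):
    """Return the [start, stop) slice of n_items owned by `rank` when the
    data is split as evenly as possible across n_ranks MPI processes."""
    base, extra = divmod(n_items, n_ranks)
    start = rank * base + min(rank, extra)
    stop = start + base + (1 if rank < extra else 0)
    return start, stop

# Example: a 10-billion-cell volume spread over 64 nodes.
n_cells, n_nodes = 10_000_000_000, 64
ranges = [block_range(n_cells, n_nodes, r) for r in range(n_nodes)]
assert ranges[0][0] == 0 and ranges[-1][1] == n_cells  # full coverage, no gaps
```

In a real MPI program each rank would call this once with its own rank number, then load and render only its slice, which is how pooled node memory accommodates multi-terabyte data sets.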

Advancing artistic endeavors

While many HPC-driven visualization solutions used in academic environments focus on data-centric research, artists benefit too. Insley, who holds advanced degrees in both computer science and fine arts, hopes artists will use Hopcroft to create new art forms.

According to Insley, past approaches to 3D animation using workstations could require months of processing time. With HPC, systems can not only run faster but also parallelize workloads to avoid system bottlenecks.
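The parallelization Insley mentions is often the simplest kind for animation: independent frames farmed out across workers, so no single workstation is the bottleneck. A minimal scheduling sketch, with illustrative names and counts (not any specific renderer's API):

```python
def assign_frames(n_frames, n_workers):
    """Round-robin assignment of animation frames to render workers.
    Returns a dict mapping worker id -> list of frame numbers it renders."""
    schedule = {w: [] for w in range(n_workers)}
    for frame in range(n_frames):
        schedule[frame % n_workers].append(frame)
    return schedule

# 240 frames across 8 nodes: 30 frames each instead of
# one workstation grinding through all 240 sequentially.
schedule = assign_frames(240, 8)
```

Because frames are independent, the wall-clock time drops roughly in proportion to the worker count, which is how months of workstation rendering collapses to days or hours on an HPC system.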

“Most art students don’t initially think about computers and science applied to visual arts, and we want to change that,” said Insley. “We have the potential to inspire the next generation of artists to embrace technology as a tool to further their artistic vision. Computer-generated graphics are just a starting point in applying HPC to art.”

“For our team at NIU, being successful with exascale-level computing means a broader audience uses the technology. We want to get people excited about HPC because the technology has advanced to the point where we can accomplish previously impossible tasks,” added Papka. “With HPC and visualization, we have stellar tools to educate future scientists and artists and inspire them to try new things.”

Rob Johnson spent much of his professional career consulting for a Fortune 25 technology company. Currently, Rob owns
Fine Tuning, LLC, a strategic marketing and communications consulting company based in Portland, Oregon. As a technology, audio, and gadget enthusiast his entire life, Rob also writes for TONEAudio Magazine, reviewing high-end home audio equipment.

This article was produced as part of Intel’s editorial program, with the goal of highlighting cutting-edge science, research and innovation driven by the HPC and AI communities through advanced technology. The publisher of the content has final editing rights and determines what articles are published.