

Understanding the Weather That Keeps NASA Flying High

An artist's depiction of NASA's Orbiting Carbon Observatory 2 (OCO-2). Launched in 2014, OCO-2 measures atmospheric carbon around the planet from space.

Read time: 6 minutes

The National Aeronautics and Space Administration (NASA) is widely known for its space exploration activities. But NASA also has a deep interest in looking towards our own planet. The agency currently has 31 Earth Observing System (EOS) satellites orbiting the globe, gathering data about weather, oceans, precipitation, climate, chemistry, clouds, fire and many other phenomena occurring across the planet’s surface. The scientists at NASA’s Global Modeling and Assimilation Office (GMAO), located at the Goddard Space Flight Center in Greenbelt, Maryland, are tasked with maximizing the use of the observations that NASA and other national and international agencies acquire.

“We provide as much understanding as we can to the workings of the atmosphere and our earth system as a whole through the use of those observations and computer modeling and prediction,” explained Dr. Bill Putman, a GMAO researcher.

Pilots, researchers, satellite engineers, climatologists, meteorologists and others all utilize this information.

GMAO research

For example, every six hours, five million observations are collected from various EOS meteorological satellites, weather balloons and sensors deployed on land, in aircraft, on ships and on ocean buoys around the world. That data is run through an ensemble of applications in the GMAO’s Goddard Earth Observing System (GEOS) model to perform short-term (up to a week) weather predictions on a global scale with 12 km2 (4.6 mi2) resolution.

The surface area of our planet is 510,072,000 km2 (roughly 196,940,000 mi2). To produce global simulations at 12 km2 resolution, the GEOS model divides Earth's surface into 42,506,000 grid cells. For each cell, the model includes a temperature, moisture, wind speed and pressure measurement, plus estimates of cloud processes, such as updrafts and diabatic heating, that occur on an even smaller scale and must be parameterized. Over each six-hour period, these measurements and estimates are continuously integrated into the calculations needed to model the weather. Other studies can run simulations at resolutions as fine as 1.5 km2 (0.6 mi2). Predicting weather and modeling climate at these scales take a lot of computing power.
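The cell count above follows from dividing the quoted surface area by the per-cell resolution; a short sketch (reusing the article's figures, which are illustrative rather than authoritative) reproduces the arithmetic:

```python
# Reproduce the article's grid-cell arithmetic for the GEOS model.
# The figures are the ones quoted above; treat them as illustrative.

EARTH_SURFACE_KM2 = 510_072_000  # Earth's surface area in square kilometers

def grid_cells(resolution_km2: float) -> int:
    """Number of grid cells needed to tile Earth's surface at a given
    per-cell resolution (in square kilometers)."""
    return round(EARTH_SURFACE_KM2 / resolution_km2)

print(grid_cells(12))    # the 12 km2 production runs
print(grid_cells(1.5))   # the finest 1.5 km2 research studies
```

At 12 km2 per cell this yields the 42,506,000 cells cited in the article; at 1.5 km2 the cell count grows eightfold, which is one reason the finest runs demand so much more computing power.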

Dr. Putman and others at the GMAO develop the codes and applications used to analyze various data and generate information. Once proven, the codes become products consumable by others at NASA and elsewhere. Some of these products include:

  • The short-range numerical weather predictions mentioned above to support activities like aircraft and satellite missions.
  • Longer-range, seasonal predictions (weeks to months), often in collaboration with other agencies, such as the National Oceanic and Atmospheric Administration (NOAA).
  • Other large collaborative efforts with national and international organizations.
  • Re-analyses of very long-term (years to decades) climate and atmospheric conditions.

GMAO also provides computational research and development that often leads to new data products or advanced production systems later.

“We support NASA field campaigns,” said Dr. Putman. “For example, there's been a lot of emphasis on aerosol and cloud interactions recently. NASA funded aircraft missions to observe various such phenomena. We provided forecast support for them in terms of flight paths and other input where they're looking for specific meteorological phenomena.”

Another example is the Orbiting Carbon Observatory 2 (OCO-2). Launched in 2014, OCO-2 measures atmospheric carbon around the planet from space. During its development, starting in 2012, Dr. Putman’s team ran a 7 km2 (2.7 mi2) global simulation of several atmospheric aerosols and chemicals that included carbon monoxide (CO) and carbon dioxide (CO2). The OCO-2 project developers used GMAO’s data to further refine their project plans and implementation for measuring carbon in Earth’s atmosphere. The OCO-2 satellite continues to provide data to NASA and the scientific community today. The OCO-3 instrument was launched May 4, 2019 and will expand our understanding of CO2 in Earth’s atmosphere.

According to Dr. Putman, aerosols have an impact beyond standard weather prediction timescales. For seasonal and climate prediction, the interactions among aerosols, clouds and radiation are especially important. But shorter-term atmospheric prediction products also came out of GMAO's work with the OCO-2 project.

“From that simulation,” explained Dr. Putman, “we ran a shorter-period simulation that included full reactive chemistry. More than just the transport of aerosols and gaseous species, we looked at full interactions of over 200 chemical species in the atmosphere, such as surface ozone concentrations. That research was a precursor to a production composition forecast product: the five-day air quality forecasts offered today.”
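The real GEOS composition runs integrate coupled rate equations for those 200-plus species at every grid cell. The basic mechanics can be hinted at with a toy two-species "box model" (entirely hypothetical species and rate constants, stepped with simple forward Euler; nothing like the actual chemistry scheme):

```python
# Toy box-model sketch of reactive-chemistry time-stepping: a hypothetical
# ozone-like species O3 produced from a precursor P, with first-order loss.
# All rate constants and concentrations are invented for illustration.

k_prod, k_loss = 0.5, 0.1   # hypothetical rate constants (1/hour)
dt, hours = 0.01, 24.0      # time step and simulation length (hours)

o3, p = 0.0, 40.0           # initial concentrations (arbitrary units)
for _ in range(int(hours / dt)):
    d_o3 = k_prod * p - k_loss * o3   # production from P, first-order loss of O3
    d_p = -k_prod * p                 # precursor consumed by production
    o3 += d_o3 * dt
    p += d_p * dt

print(f"O3 after {hours:.0f} h: {o3:.2f} (arbitrary units)")
```

A production system does this for hundreds of interacting species, with photolysis, emissions and transport coupled in, which is why full-chemistry runs are among GMAO's most expensive simulations.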

Besides short-term, long-term, and field support, GMAO scientists provide simulations over very long periods, called re-analyses.

“Reanalysis is kind of the merger of all of our activity. It maximizes the use of the observing system, integrating observations and modeling to provide a baseline for climate across decades.”

Scientists integrate observations from the last 30 or more years with the data assimilation system to produce a record of the atmosphere over that period. The result is what GMAO calls a ‘climate quality’ re-analysis, used for studying trends and as a baseline for the current period. Many climate prediction groups use NASA re-analyses as the baseline for their predictions.
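In practice, using a reanalysis as a baseline often means computing anomalies against a multi-decade climatology. A minimal sketch with invented monthly temperatures (a real workflow would draw on a GMAO reanalysis such as MERRA-2):

```python
# Sketch of using a reanalysis record as a climate baseline: compute a new
# month's anomaly against the multi-decade mean for that month. The numbers
# below are invented for illustration.

from statistics import mean

# 30 hypothetical January mean temperatures (deg C) from a reanalysis record,
# with a small made-up warming trend
january_record = [-2.0 + 0.05 * year for year in range(30)]

climatology = mean(january_record)   # the 30-year baseline for January
this_january = -0.8                  # a new January value to compare
anomaly = this_january - climatology

print(f"January climatology: {climatology:.2f} C, anomaly: {anomaly:+.2f} C")
```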

“There are only a few centers around the world,” said Dr. Putman, “that produce these types of high-quality re-analyses. They become very reputable data sets and useful for services around the nation and the world.”

These snapshots from 40-day simulations beginning on August 1, 2016 demonstrate the representation of convective clouds in the GEOS model. Shown here are simulated infrared brightness temperatures at resolutions ranging from 200 to 3 kilometers (km) compared with observed data at 4-km resolution (lower right). (Visualizations by William Putman, NASA Goddard Space Flight Center, courtesy of NASA).

Supercomputing for weather and climate

To support the level of computing that GMAO needs, the NASA Center for Climate Simulation (NCCS) in the Computational and Information Sciences and Technology Office (CISTO) deploys large capability supercomputing clusters. Their current system, called Discover, is an evolution over several years of scaling out systems to accommodate GMAO’s (and other departments’) computational demands.

To date, Discover has comprised 15 separate systems called Scalable Compute Units (SCUs); six SCUs are currently operating. It is designed for fine-scale, high-fidelity atmosphere and ocean simulations that span from days to decades and centuries, depending on the application (weather prediction, atmospheric and climate modeling, ensemble forecasts, etc.).

“Discover supports GMAO, the Goddard Institute for Space Studies (GISS), multiple field campaigns, and other NASA science research,” said Dan Duffy, CISTO Chief. “Every year we upgrade a portion of the cluster.” These systems are provided by a variety of vendors, such as IBM, Dell, and SGI (now part of HPE).

According to Duffy, Discover’s SCUs are grouped into a few islands based on their interconnect technology, such as InfiniBand 100 Gbps, InfiniBand 50 Gbps, and Intel Omni-Path Architecture 100 Gbps. This year, SCU15, built by Aspen Systems, was added, bringing Discover to over 129,000 cores and a peak capacity of nearly 6.8 petaFLOPS. SCU15 comprises 160 Supermicro Twin Pro servers with Intel Xeon Gold 6148 processors interconnected by Intel Omni-Path Architecture. Today’s global short-term prediction models, with their 42,506,000 grid cells, are computations GMAO could not have run before Discover.
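Those headline numbers imply a per-core theoretical peak that is easy to back out; a quick back-of-envelope check, treating the quoted totals as approximate:

```python
# Back-of-envelope check on Discover's quoted figures: ~129,000 cores and
# ~6.8 petaFLOPS peak imply roughly 50 gigaFLOPS per core (theoretical peak,
# not sustained application performance).

cores = 129_000
peak_flops = 6.8e15  # 6.8 petaFLOPS

per_core_gflops = peak_flops / cores / 1e9
print(f"~{per_core_gflops:.0f} GFLOPS per core (theoretical peak)")
```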

Better technology means greater insight

“The advances in computing technology have allowed us to scale from 50-km2 grid spacing 20 years ago with a single weather prediction per day to today’s 12-km2 spacing with predictions done every six hours,” commented Dr. Putman. “We went from a three-dimensional variational data assimilation system to a four-dimensional system that spreads the information from observations more consistently over the six-hour assimilation window.”
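The variational analysis Dr. Putman describes blends the model's first guess with observations, weighting each by its error statistics. That blending step can be shown for a single variable (toy numbers only; the real 4D system solves this jointly for millions of variables across the six-hour window):

```python
# One-variable sketch of the analysis step in variational data assimilation:
# the analysis is the error-variance-weighted blend of the model background
# and the observation. Toy numbers, not the GEOS system.

background = 290.0   # model first guess, e.g. a temperature (K)
b_var = 1.0          # background error variance
obs = 292.0          # observed value (K)
r_var = 0.5          # observation error variance

gain = b_var / (b_var + r_var)                  # Kalman-style gain in [0, 1]
analysis = background + gain * (obs - background)

print(f"analysis: {analysis:.2f} K (gain {gain:.2f})")
```

Because the observation here is twice as trustworthy as the background (half the error variance), the analysis lands two-thirds of the way from the first guess to the observation.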

“They’ve also been able to run forecasts and models with up to ten times that resolution,” added Duffy. “Using one of the SCUs with over 30,000 cores, they ran a global prediction at 1.5 km2, which was the highest-resolution global atmosphere simulation run by a US model at that time.”

According to Dr. Putman, each new cluster delivers new capability, not only to run at finer resolutions but also to experiment with their models to understand how to improve their production products. For example, the very fine 1.5 km2 research model turned into a much longer-term prediction capability with 3 km2 grid spacing, now offered as a research product. And they are developing artificial intelligence (AI) algorithms that can make their most expensive simulations, such as long-term simulations of chemistry interactions, run more efficiently.

“We also get the benefit of providing higher resolution processes to instrument teams and satellite collaborations who are working on future designs,” added Dr. Putman. “By providing them a comprehensive simulation at say three kilometers over a long period of time, they can do observing system simulation experiments. They can treat our simulation as though it were reality, sample those simulations using their various instruments to plan their future observing missions, and we can provide analysis of their observations and the impact those observations might have in future prediction systems.”
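An observing system simulation experiment of the kind described here can be caricatured in a few lines: treat a synthetic field as "truth," sample it along a hypothetical ground track, and add instrument noise (everything below is made up for illustration):

```python
import math
import random

# Sketch of an observing system simulation experiment (OSSE): a model field
# stands in for reality, and a planned instrument "flies over" it, sampling
# the field with noise. The field, track, and noise level are all synthetic.

random.seed(0)

def truth(lat: float, lon: float) -> float:
    """A smooth synthetic 'nature run' field (temperature-like, in K)."""
    return 288.0 + 10.0 * math.cos(math.radians(lat)) + 2.0 * math.sin(math.radians(lon))

# Sample along a simple hypothetical ground track with 0.5 K instrument noise
track = [(lat, (lat * 4) % 360 - 180) for lat in range(-80, 81, 10)]
samples = [truth(lat, lon) + random.gauss(0.0, 0.5) for lat, lon in track]

print(f"{len(samples)} simulated observations, mean {sum(samples) / len(samples):.1f} K")
```

Comparing retrievals from such synthetic samples against the known "truth" is how instrument teams estimate, before launch, how much a proposed observing mission would improve future prediction systems.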

The GMAO is a critical contributor to NASA and atmospheric science. Scientists such as Dr. Putman provide NASA and the wider community of climatologists and meteorologists with richer tools and higher-quality information to attain deeper insight into our planet’s day-to-day weather and health. With ever-expanding computational capacity from supercomputers like Discover, the continued evolution of the GEOS model, and more advanced prediction products, GMAO can further its mission to support NASA’s ongoing work around – and over – the globe.

Find out more about the Goddard Space Flight Center at https://www.nasa.gov/goddard; learn about the GMAO at https://gmao.gsfc.nasa.gov/; and read about the Discover cluster at https://www.nccs.nasa.gov/systems/discover.   

This article was produced as part of Intel’s editorial program, with the goal of highlighting cutting-edge science, research and innovation driven by the HPC and AI communities through advanced technology. The publisher of the content has final editing rights and determines what articles are published.