While conservation efforts and environmental regulations have played an important role in limiting damage to our lakes and rivers, human pollution and natural disasters are still capable of wreaking havoc. Recognizing this risk, the Baden-Württemberg Stiftung in 2017 funded an expansive, multidisciplinary project aimed at better understanding how water quality is affected when pollution enters a river or stream.
The effort brings together environmental scientists, biologists, civil engineers, and chemical engineers, among others, combining high-performance computing (HPC) simulation with experimental techniques. Among the team members is a group of researchers from the Karlsruhe Institute of Technology (KIT) who are using HPC resources at the High-Performance Computing Center Stuttgart (HLRS) to run complex multiphase turbulent flow simulations of how solid pollutants settle on a river bed. This work is helping researchers better understand how these pollutants become buried or actively move and spread.
“If you think about a canal or a river, you always have sediment, which might act in your benefit or not,” said KIT Professor Dr. Markus Uhlmann, principal investigator on the project. “Sediment could be good in the sense that it helps support the foundation of a bridge pier, or it can contain pollutants that can be carried downstream. To predict and control the spread of pollution it is important to know if particles are going to move and your bed is going to be eroded.”
Using HLRS’s Hawk supercomputer, the team conducted the first ab initio simulation — calculations that start from first principles — of the sediment pattern on a riverbed while also simulating settling contaminant particles from above, publishing its results in the Journal of Fluid Mechanics. The work also resulted in team member Markus Scherer being awarded an HLRS Golden Spike Award at the center’s October 2021 Results and Review workshop.
When modelling how pollutants disperse in rivers and other waterways, scientists rely on supercomputers like Hawk to better understand the chaotic, turbulent motions in fluid flows. In such studies, researchers run computationally expensive direct numerical simulations (DNS) that divide the fluid in question into a fine, three-dimensional grid, also called a computational mesh, and then solve equations in each grid cell that represent changes in fluid motion over time.
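The grid-and-time-stepping idea behind DNS can be illustrated with a much simpler toy. The sketch below (not the team's solver, and drastically simplified) advances a scalar "pollutant" field on a periodic one-dimensional grid by explicit finite differences; all names and parameter values are illustrative, and a real DNS solves the full Navier-Stokes equations on a three-dimensional mesh instead.

```python
import numpy as np

def diffuse(field, nu, dx, dt, steps):
    """Advance a scalar field on a periodic 1-D grid by explicit
    finite-difference time stepping of the diffusion equation
    du/dt = nu * d2u/dx2 -- a toy stand-in for the per-cell
    equations a real DNS solves at every time step."""
    u = field.copy()
    for _ in range(steps):
        # Second-order central difference for the Laplacian,
        # with periodic boundaries via np.roll.
        lap = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
        u = u + dt * nu * lap
    return u

# A sharp pollutant blob smooths out over time, while its total
# mass (the sum over the periodic grid) stays conserved.
n = 64
dx = 1.0 / n
u0 = np.zeros(n)
u0[n // 2] = 1.0
u1 = diffuse(u0, nu=0.001, dx=dx, dt=0.01, steps=100)
```

The cost of the real thing comes from exactly this structure: the work scales with the number of grid cells times the number of time steps, and resolving turbulence requires both to be very large.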
A visualization of a scalar contamination (green) emerging from the sediment bed (black/white particles). The flow is from left to right and what you see is a "ripple" feature. Image credit: Michael Krayer.
Simulating turbulent fluid motion by itself is a significant challenge, but in order to understand how pollutant particles spread in a waterway, researchers must do DNS calculations of so-called multiphase flows. These simulations also account for the ability of particles to influence and modify fluid flows. Although this approach results in a more realistic model, it also significantly increases the computational demands.
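The distinction can be sketched in code. In a simple one-way-coupled model, each particle just responds to the local fluid velocity; the multiphase DNS described above additionally feeds the particles' drag forces back onto the flow, which is a major part of the extra cost. The snippet below is a hedged illustration of the one-way-coupled particle side only, using a point particle with linear (Stokes-like) drag; the names, the response time `tau_p`, and the time step are all assumptions for illustration.

```python
import numpy as np

def step_particle(pos, vel, fluid_vel, tau_p, g, dt):
    """One explicit Euler step for a point particle with linear drag:
    dv/dt = (u_fluid - v) / tau_p + g.  Here the fluid is unaffected
    by the particle (one-way coupling); a fully coupled multiphase
    simulation would also apply the opposite drag force to the flow."""
    acc = (fluid_vel - vel) / tau_p + g
    vel = vel + dt * acc
    pos = pos + dt * vel
    return pos, vel

# A particle released at rest in still water relaxes toward its
# terminal settling velocity, v_terminal = tau_p * g.
pos = np.zeros(3)
vel = np.zeros(3)
g = np.array([0.0, 0.0, -9.81])
tau_p = 0.05
for _ in range(200):
    pos, vel = step_particle(pos, vel, np.zeros(3), tau_p, g, dt=0.005)
```

Tracking millions of such particles while also resolving how each one disturbs the surrounding turbulence is what pushes these simulations onto machines like Hawk.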
Solid particles also add another complication to the research: the team must model how particles disperse as they settle toward the river bed. This is not a trivial issue, as particles do not permanently stay where they land on the stream bottom. Because of water currents, the bottom of a river or stream is always in motion. When a pollutant first enters a waterway, this can actually be advantageous; the particles fall to the bottom of the river and can be buried by moving sediment. Unfortunately, over a longer period, the opposite is true: pollutants that were buried and forgotten can be re-released during flooding events as sediment is pushed downstream, allowing them to spread once again.
Predicting the cycle in which pollutants are buried and resurface is further complicated by the uneven nature of sediment motion. Grains of sand, pulverized stones, and other microscopic debris that form the sediment bed will often form “ridges,” creating an uneven distribution that further influences water flow and, in turn, how sediment moves downstream.
“The main thing we are interested in is the effect of different patterns in the sediment,” said Michael Krayer, a KIT researcher and collaborator on the project. “We started with simulations of these dune-like features, what we call ripples. They propagate with the current while moving from side to side, which is generally good for burying things, but the meandering ridges that form are much less effective at doing so. When charting pollutants’ movements, it makes a big difference what kind of pattern you have.”
“When it comes to these sediment ridges, they are closely related to very complex turbulent structures,” said Markus Scherer, KIT researcher and another collaborator on the project. “When modelling these interactions, you have to ask if you’ve captured everything, or do what we do—resolve all scales and see what is most relevant for studying the changes and movement of sediment ridges. Those insights can then feed computational models that run on more modest computing resources and deliver answers quickly in the event of an environmental emergency.”
With access to HLRS resources, the team has made significant progress in better understanding pollutant transport in waterways, although they know more work needs to be done. Uhlmann indicated that the team does not necessarily need to run a larger or higher-resolution version of its model, although in the future it would benefit from the ability to run “parameter sweeps.” In such sweeps, the team would change a single parameter at a time across many simulations and observe how each modification affects the entire system.
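The one-at-a-time sweep idea can be sketched in a few lines. The model function below is a hypothetical stand-in for a full simulation run, and the parameter names and values are invented for illustration; the structure (a fixed baseline, with one parameter varied per batch) is the point.

```python
def toy_settling_model(grain_size, flow_speed):
    """Hypothetical stand-in for one simulation run: returns a crude
    mobility score (higher means sediment is more likely to move)."""
    return flow_speed / grain_size

# Hold every parameter at a baseline and vary a single one per batch.
baseline = {"grain_size": 1.0, "flow_speed": 1.0}
sweeps = {
    "grain_size": [0.5, 1.0, 2.0],
    "flow_speed": [0.5, 1.0, 2.0],
}

results = {}
for name, values in sweeps.items():
    for v in values:
        params = dict(baseline, **{name: v})  # override one parameter
        results[(name, v)] = toy_settling_model(**params)
```

In practice each "run" would be a full DNS taking many node-hours, which is why such sweeps demand sustained access to large systems rather than a single heroic simulation.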
Running many DNS iterations is not only computationally demanding, it also creates massive datasets. The team has worked closely with HLRS user support staff to find ways to port its code to run more effectively on Hawk’s architecture, and to identify efficient approaches for processing datasets at HLRS before moving them to KIT’s Steinbuch Centre for Computing for long-term storage and further analysis.
Uhlmann indicated that HPC centres such as HLRS have an important role to play in helping researchers make the best use possible of these large systems and their associated data infrastructures. “User support staff at HLRS helped optimize memory access patterns on one of our old solvers, and it gave us a good improvement,” he said. “It is also important that HPC centres like HLRS look to the users when they get ready to buy hardware, and GCS centres have taken that into account. We’ve been part of discussions with GCS centres about what a new system deployment should look like, and I think that is very helpful.”
Funding for Hawk was provided by the Baden-Württemberg Ministry for Science, Research, and the Arts and the German Federal Ministry of Education and Research through the Gauss Centre for Supercomputing (GCS).