Visiting Scholars Look Inside Simulation's "Black Box"

PD Dr. Johannes Lenhard and Prof. Dr. Nicole Saam.

Researchers working with the HLRS Department of Philosophy of Simulation strive for a clearer understanding of computational research and its limitations.

Simulation and other applications of high-performance computing have become indispensable tools for research in the basic and applied sciences. Employing HPC has had many positive impacts, helping researchers gain a deeper understanding of our world and enabling the development of new technologies. As our reliance on computer simulation grows, however, many wonder how it is changing science, technology, politics, and society, and how we should react.

As a supercomputing center, HLRS is unusual in that for many years it has prioritized discussion about such issues. Through its Department of Philosophy of Simulation, HLRS promotes interdisciplinary dialogue among simulation scientists, philosophers, historians of science, sociologists, and other scholars whose insights enable reflection on the nature of simulation.

In 2018 and 2019, the Department of Philosophy of Simulation hosted visiting scholars PD Dr. Johannes Lenhard (Philosophy Department, University of Bielefeld) and Prof. Dr. Nicole J. Saam (Institute for Sociology, Friedrich-Alexander-Universität Erlangen-Nürnberg). Their research highlights some key questions facing the philosophy of simulation today, and their experiences at HLRS show the benefits of promoting multidisciplinary discourse.

Simulation yesterday and today

Johannes Lenhard initially trained as a mathematician, but in 2001 became captivated by philosophical and historical questions related to the growing use of computing across many scientific disciplines. He realized that a conceptual framework was needed for discussing the deeper implications of this trend.

"For philosophers of science," Lenhard says, "one important problem is what it means for a scientist to know something. In the case of computer simulation, is the knowledge it produces different from that produced by older methods? If so, how? As a historian, one can also ask how the conditions in which computer simulation is used have changed over the years. This gives us a better understanding of how simulation is used today." In his research Lenhard has written about features that distinguish today's computer simulation from earlier mathematical modelling.

One example is how experimentation with simulations has changed. Supercomputers enable scientists to easily adjust a mathematical model's parameters so that its output approaches observed data. Deep learning algorithms based on neural networks are perhaps the best example of this approach; by adjusting parameters, scientists gain the ability to model almost any behavior. But although such a capability can be useful, Lenhard asks, is it still science?
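As a rough illustration of the kind of parameter adjustment Lenhard describes, the following minimal sketch fits a simple exponential model to synthetic "observed" data. The model, the data, and the use of NumPy and SciPy are illustrative assumptions, not details drawn from the interview.

import numpy as np
from scipy.optimize import curve_fit

# Hypothetical "observed" data (synthetic, for illustration only).
t_observed = np.linspace(0.0, 4.0, 20)
y_observed = 2.5 * np.exp(-1.3 * t_observed) + np.random.normal(0.0, 0.05, t_observed.size)

# A simple parametric model: y = a * exp(-b * t).
def model(t, a, b):
    return a * np.exp(-b * t)

# Adjust the parameters a and b so that the model's output approaches the
# observed data (nonlinear least-squares fitting).
params, _ = curve_fit(model, t_observed, y_observed, p0=(1.0, 1.0))
print("fitted parameters:", params)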

"The idea that general laws can reveal order in chaos is historically connected to the idea that these laws can be formulated mathematically," Lenhard says. "But newer kinds of computer modelling have nothing to do with finding such general laws. Today, we hope to describe and manage things in a predictable way, even if we don't have a law for it. This is a completely new option."

The opacity of simulation

Perhaps the most interesting philosophical problem in computer simulation arises from what philosophers call its "opacity." Often, a computer simulation behaves in unexpected ways. In many cases it is impossible to determine why, because there is no way to mechanistically observe how the algorithm functions. "This turns the idea of mathematical modeling on its head," Lenhard says, "because we used to think that mathematical models should be able to make the causes of a behavior crystal clear."

During her residency at HLRS, Nicole J. Saam has been developing a systematic method for defining, categorizing, and measuring such opacity. Although her project is still evolving, such a model might address factors such as instability in the numerical model itself, methods used to simplify a model to reduce computational effort, and the number of different research teams involved in developing a model, each of which understands only a small piece of the project. Such criteria reflect the fact that opacity in computer simulation can have technical, mathematical, and social origins.

After defining the dimensions of opacity, Saam developed a scientific framework and questionnaire. Her team has started conducting interviews with groups of scientists who run their simulations at HLRS in order to gain a more precise understanding of opacity in all of its different forms.

"Opacity can look quite different from the perspective of a principal investigator in comparison with that of a graduate student," Saam points out. "Because of each's degree of experience, their perceptions of opacity can be different. Is opacity an objective condition of simulation, or is it a subjective experience that varies from individual to individual? As defined now, opacity is a relational concept. We would love to have a way to measure this."

Saam also anticipates a practical use for such a tool. "If one could know in advance which kinds of simulation models typically suffer from specific problems due to opacity," she suggests, "one might be able to identify and avoid typical problems early in the process of implementing a model." What might at first seem to be a theoretical issue could thus enable tangible improvements in how simulation is done.

The value of interdisciplinary dialogue

Bridging the conceptual divides between scientists and researchers in the humanities and social sciences is not easy, but Saam's project suggests why it is important to make the effort. "Sometimes it can take years or even decades to get scientists and researchers from diverse disciplines such as natural sciences, engineering sciences, humanities, or the social sciences to have a productive conversation," Saam remarks. "The opportunity for me to work directly with simulation scientists here at HLRS is very unusual."

Lenhard agrees. "I find it wonderful that HLRS has a group whose job it is to talk with interested scientists about their work, and to promote greater self-reflection about what they are doing," he observes. "It is something that the scientific community and society in general need to do more often." 

Christopher Williams