Sheet metal forming is crucial to the automotive industry, where it is used to produce components such as doors, hoods, and fenders. The geometries of the finished parts are often extremely complex. Simulations have been used for many years to support the design of the required forming tools and can, for example, reduce the cost and effort of post-processing.

The problem is that sheet metal forming simulations match reality only to a limited extent. This matters all the more given the high cost of the machine tools involved: in the automotive industry, a single machine tool can quickly cost seven figures. Narrowing the gap between simulation and reality requires accurate materials data. A team headed by Dr. Celalettin Karadogan of the Institute for Metal Forming Technology (IFU) and Dennis Hoppe of the High-Performance Computing Center Stuttgart (HLRS) is working on an approach for determining such data for the material models used in the simulations with high accuracy and with as little experimental effort and time as possible.
“These days,” Karadogan explains, “we use established material models and simulations based on these models to calculate forming processes. Nevertheless,” he continues, “the simulated and physical materials differ from one another because, whilst we can empirically determine physical variables, such as the yield stress, for the material models, the measured data cannot be transferred directly into the computational model in the form of variables.”
There are major differences between the behavior of the simulated and the physical material; a neural network is intended to remedy this. (Photo: Max Kovalenko)
Karadogan's team now wants to use AI, or more precisely a neural network, to perform this transfer. “During the material tests,” he explains, “we project a pattern onto our samples, which we record along with the measured forces.” Both the pattern image data and the measured forces are fed into a neural network, which is then tasked with determining the mathematical variables of the material model from this input. The team varies these variables numerically so as to cover as many materials as possible. “Rather than just the two measured values per test sample used in previous approaches,” Karadogan clarifies, “this approach gives us 1000 measured values per sample.”
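The idea described above, training a neural network on simulated data so that it can map measured quantities back to the parameters of a material model, can be sketched in miniature. The following Python example is purely illustrative: the toy `forward_model` (a simple power-law hardening curve), the parameter ranges, and the tiny network are assumptions for demonstration only and do not reflect the actual IFU/HLRS code or material models.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_model(params):
    """Toy stand-in for a forming simulation: maps two material-model
    parameters (a yield-stress-like value sigma0 and a hardening
    exponent n) to a 10-point force-like 'measurement' curve."""
    strain = np.linspace(0.0, 1.0, 10)
    sigma0, n = params[..., :1], params[..., 1:]
    return sigma0 * (1.0 + strain) ** n

# 1) Generate training data: sample parameters, run the "simulation".
params = rng.uniform([100.0, 0.1], [300.0, 0.5], size=(2000, 2))
curves = forward_model(params)

# 2) Normalize inputs and targets for stable training.
x_mu, x_sd = curves.mean(0), curves.std(0)
y_mu, y_sd = params.mean(0), params.std(0)
X = (curves - x_mu) / x_sd
Y = (params - y_mu) / y_sd

# 3) A tiny one-hidden-layer network, trained by full-batch gradient
#    descent to invert the forward model (measurements -> parameters).
W1 = rng.normal(0.0, 0.3, (10, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.3, (32, 2)); b2 = np.zeros(2)
lr = 0.1

def predict(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

_, out0 = predict(X)
mse_initial = ((out0 - Y) ** 2).mean()

for _ in range(2000):
    h, out = predict(X)
    err = (out - Y) / len(X)           # per-sample-averaged error
    gW2, gb2 = h.T @ err, err.sum(0)   # output-layer gradients
    dh = (err @ W2.T) * (1.0 - h ** 2) # backprop through tanh
    gW1, gb1 = X.T @ dh, dh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, out1 = predict(X)
mse_final = ((out1 - Y) ** 2).mean()

# 4) "Measure" a new sample and recover its parameters.
true_params = np.array([200.0, 0.3])
measured = forward_model(true_params)
_, z = predict((measured - x_mu) / x_sd)
recovered = z * y_sd + y_mu
```

In the real project, each "simulation" is of course a full forming simulation rather than a one-line formula, which is why generating the training data alone calls for supercomputing resources.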
Training the neural network for this purpose requires two billion simulations. As this is far more than a conventional computer can handle, Karadogan's group has requested 100 million core hours of computing time on HLRS's supercomputer Hawk via the Gauss Centre for Supercomputing; a core is one of the supercomputer's processing units. The researchers are now collaborating with Hoppe, Head of Service Management & Business Processes at HLRS, and his team to combine simulations and AI, a combination that is increasingly used in data science and is being researched intensively at HLRS under the auspices of the CATALYST project. In a pilot project in which five million simulations served as training data for the neural network, the participants have already demonstrated that the approach works in principle.
— Michael Vogel
This article was initially published under the title "Improving Simulations" in the September 2021 issue of Forschung Leben, the magazine of the University of Stuttgart. Republished with permission.