Zellmann, S., Aumüller, M., Marshak, N., Wald, I.: High-Quality Rendering of Glyphs Using Hardware-Accelerated Ray Tracing. In: Frey, S., Huang, J., and Sadlo, F. (eds.) Eurographics Symposium on Parallel Graphics and Visualization (2020).
Glyph rendering is an important scientific visualization technique for 3D, time-varying simulation data and for higher-dimensional data in general. Though conceptually simple, glyph rendering poses several challenges when realized on top of triangle rasterization APIs, such as possibly prohibitive polygon counts, limitations on the shapes that can be used for the glyphs, and issues with visual clutter. In this paper, we investigate the use of hardware ray tracing for high-quality, high-performance glyph rendering, and show that this not only leads to a more flexible and often more elegant solution for dealing with the number and shape of glyphs, but can also help address visual clutter, and even provide additional visual cues that enhance understanding of the dataset.
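The key idea behind ray-traced glyphs is that a glyph such as a sphere need not be tessellated into triangles at all; it can be intersected analytically per ray. As a rough illustration of that idea only (a CPU-side sketch with a hypothetical `ray_sphere` helper, not the paper's hardware-accelerated implementation), the quadratic ray-sphere test looks like this:

```python
import math

def ray_sphere(o, d, c, r):
    """Analytic ray-sphere intersection. Returns the smallest positive
    ray parameter t such that o + t*d lies on the sphere with center c
    and radius r, or None on a miss. The direction d is assumed to be
    normalized, so the quadratic reduces to t^2 + 2*b*t + cc = 0."""
    oc = [o[i] - c[i] for i in range(3)]          # vector from center to origin
    b = sum(oc[i] * d[i] for i in range(3))       # half the linear coefficient
    cc = sum(x * x for x in oc) - r * r           # constant coefficient
    disc = b * b - cc                              # discriminant
    if disc < 0.0:
        return None                                # ray misses the sphere
    s = math.sqrt(disc)
    for t in (-b - s, -b + s):                     # nearer root first
        if t > 0.0:
            return t
    return None
```

A glyph renderer built this way needs only one analytic primitive per glyph instead of hundreds of triangles, which is why polygon counts stop being a limiting factor.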
Aumüller, M.: Hybrid Remote Visualization in Immersive Virtual Environments with Vistle. In: Childs, H. and Frey, S. (eds.) Eurographics Symposium on Parallel Graphics and Visualization (2019).
Because of the spatial separation of high performance compute resources and immersive visualization systems, their combined use requires remote visualization. Remote rendering incurs increased latency from user interaction to display. For immersive virtual environments, this latency is a bigger problem than for desktop visualization. With hybrid remote visualization we enable the exploration of large-scale remote data sets from immersive virtual environments. This is based on three factors: When appropriate, we enable the local rendering of remote objects. We decouple local interaction from remote rendering as far as possible by depth compositing of remote and local images at a rate independent of remote rendering. Finally, we try to hide this latency by reprojecting 2.5D images for changed viewer positions. In this paper we describe the integration of hybrid remote rendering into the data-parallel visualization system Vistle as well as its extension to a distributed system. This makes arbitrary combinations of object-based and image-based remote visualization possible.
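The depth compositing step described in the abstract can be sketched as a per-pixel nearest-fragment merge of a locally rendered image with a remotely rendered 2.5D image (color plus depth). The function name and array layout below are assumptions for illustration, not Vistle's actual API:

```python
import numpy as np

def depth_composite(local_rgb, local_depth, remote_rgb, remote_depth):
    """Per-pixel depth compositing of a local and a remote 2.5D image.
    For every pixel, the fragment closer to the viewer (smaller depth)
    wins, so local interaction can run at full frame rate while remote
    frames arrive asynchronously."""
    # Boolean mask: True where the local fragment is in front (ties go local).
    local_wins = local_depth <= remote_depth
    # Broadcast the mask over the RGB channel axis to pick colors.
    rgb = np.where(local_wins[..., None], local_rgb, remote_rgb)
    # The composited depth is the nearer of the two, per pixel.
    depth = np.minimum(local_depth, remote_depth)
    return rgb, depth
```

Because the merge is purely per-pixel, it can run at the local display rate regardless of how slowly new remote frames arrive, which is the decoupling the abstract refers to.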
Numerical analysis of large-scale and multidisciplinary problems on high-performance computer systems is one of the main computational challenges of the 21st century. The amount of data processed in complex systems analyses approaches peta- and exascale. The technical possibility of real-time visualization, post-processing and analysis of large-scale models is extremely important for carrying out comprehensive numerical studies. Powerful visualization is going to play an important role in the future of large-scale models. In this paper, we describe several software extensions, developed by our team, aimed at improving visualization performance for large-scale models in 3D virtual environment systems such as CAVEs and Powerwalls. These extensions include an algorithm for real-time generation of isosurfaces on large meshes and a visualization system designed for massively parallel computing environments. In addition, we describe an augmented reality system developed by part of our team in Stuttgart.
Vistle is a scalable distributed implementation of the visualization pipeline. Modules are realized as MPI processes on a cluster. Within a node, different modules communicate via shared memory. TCP is used for communication between clusters. Vistle especially targets interactive visualization in immersive virtual environments. For low latency, a combination of parallel remote and local rendering is possible.
Zellmann, S., Aumüller, M., Lang, U.: Image-Based Remote Real-Time Volume Rendering: Decoupling Rendering From View Point Updates. In: ASME 2012 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, pp. 1385-1394 (2012).
Remote rendering is employed when the visualization task is too challenging for the hardware used to display a dataset, or when transferring the complete dataset would be too time consuming. Volume visualization, whose dataset sizes grow with the third power of their spatial resolution, is such a task. Since remote rendering introduces additional sources of latency, its applicability to virtual environments is limited by the required low delay from user action to displayed image. We counter these latencies with image-based rendering techniques: color image data, along with additional depth information, is warped while new data has not yet been completely received. Using these approximate images, it is possible to decouple the cheap display phase from rendering. While depth values are trivially deduced for polygons, we contribute heuristics for volumetric datasets with varying transparency.
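The warping step that the abstract relies on can be illustrated at the level of a single pixel: given its depth, the pixel is unprojected into a 3D point and reprojected for the new viewpoint. The sketch below assumes a simple pinhole camera with 3x3 intrinsics K and a rigid transform (R, t) between the old and new camera frames; the function name is hypothetical and this is not the paper's implementation:

```python
import numpy as np

def warp_pixel(u, v, z, K, R, t):
    """Reproject one pixel of a 2.5D image (color + depth) into a new
    view. (u, v) is the pixel position, z its depth, K the 3x3 camera
    intrinsics, and (R, t) the rigid transform from the old camera
    frame to the new one. Returns the pixel position in the new view."""
    # Unproject to a 3D point in the old camera frame.
    p = z * np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Transform into the new camera frame and project back to the image.
    q = K @ (R @ p + t)
    return q[0] / q[2], q[1] / q[2]
```

Applying this to every pixel of the last received color/depth image yields the approximate image displayed until the next remote frame arrives; the quality of the result hinges on the depth values, which is why the heuristics for semi-transparent volume data matter.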
With the availability and easy accessibility of high performance computing resources, product development in engineering applications has shifted almost exclusively from experiments and model tests to computer simulations. Even after the initial construction of a satisfactory design in the rapid prototyping phase, much potential for optimization usually remains. Achieving an optimal design requires the execution of multiple simulation workflows with different parameters, ensuring optimal operation at different operating points. Gaining insight from simulation results into how changes to a model affect the performance of the simulated machine requires interactive exploration of these datasets through, ideally, in situ post-processing. As the complexity of simulation data increases, traditional post-processing of datasets on single workstations is no longer possible due to limitations in system memory, network bandwidth and compute power. In this paper, we present how remote HPC resources can be used for interactive design, simulation and interactive analysis to support engineering workflows in product development.
Stewart, C.A., Keller, R., Repasky, R., Hess, M., Hart, D., Müller, M.S., Sheppard, R., Wössner, U., Aumüller, M., Li, H., Berry, D.K., Colbourne, J.: A Global Grid for Analysis of Arthropod Evolution. In: Buyya, R. (ed.) GRID, pp. 328-337 (2004).
Maximum likelihood analysis is a powerful technique for inferring evolutionary histories from genetic sequence data. During the fall of 2003, an international team of computer scientists, biologists, and computer centers created a global grid to analyze the evolution of hexapods (arthropods with six legs). We created a global grid of computers using systems located in eight countries, spread across six continents (every continent but Antarctica). This work was done as part of the SC03 HPC challenge, and the project received an HPC challenge award for the "most distributed application". More importantly, the creation of this computing grid enabled the investigation of important questions regarding the evolution of arthropods: research that would not otherwise have been undertaken. Grid computing thus leads directly to new scientific insights.