Project Details

Rendering procedural textures for huge digital worlds

Subject Area Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term from 2019 to 2023
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 431478017
 
Final Report Year 2024

Final Report Abstract

Visually complex and tremendously large 3D data is a serious challenge in graphics applications such as movie productions. In a classical production pipeline, rendering of a 3D scene is separated from the generation of textures, which typically contribute most of the visual details. Such textures are commonly (also) obtained from procedural models, which are well suited to creating stochastic textures, e.g. to mimic natural phenomena. Procedural texturing is a generative approach in which textures are compactly represented by a set of functions and procedures that are evaluated to produce a final texture. In this project we built on the Procedural Texture Graph (PTG) model, which represents the generative process as a graph whose source nodes are mathematical functions, inner nodes are pixel-processing operations, and sink nodes are the final output textures. In a typical production pipeline, these textures are either computed upfront, which is extremely storage-demanding in today's production environments, or evaluated on the fly during texture accesses, resulting in many redundant calculations. Our project was concerned with treating procedural texture synthesis and photorealistic rendering as one tightly coupled entity, making the rendering of highly detailed scenes feasible through on-demand texture synthesis and thereby reducing memory requirements and redundant calculations. The project results contribute to different aspects of these overarching goals. One contribution targets the computation of textures representing glittering materials with microfacet models. Our idea was to directly generate the result of the PTG at different prefiltered levels using sparse dictionaries. This work resulted in three research papers, publicly available source code, and two popularization videos; it has also been integrated in an industrial rendering engine.
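The PTG structure described above can be illustrated with a minimal sketch: a directed graph where source nodes evaluate mathematical functions of texture coordinates, inner nodes apply pixel operations, and the sink node is only evaluated for the texels a renderer actually touches. The node interface and functions below are illustrative assumptions, not the project's actual API.

```python
import math

class Node:
    """Base class for PTG nodes (illustrative sketch)."""
    def eval(self, u, v):
        raise NotImplementedError

class Source(Node):
    """Source node: a mathematical function of texture coordinates."""
    def __init__(self, fn):
        self.fn = fn
    def eval(self, u, v):
        return self.fn(u, v)

class Op(Node):
    """Inner node: a pixel-processing operation applied to its inputs."""
    def __init__(self, op, *inputs):
        self.op, self.inputs = op, inputs
    def eval(self, u, v):
        return self.op(*(n.eval(u, v) for n in self.inputs))

# A tiny graph: two sine-based sources blended by one inner node.
stripes = Source(lambda u, v: 0.5 + 0.5 * math.sin(20.0 * u))
rings = Source(lambda u, v: 0.5 + 0.5 * math.sin(40.0 * math.hypot(u - 0.5, v - 0.5)))
sink = Op(lambda a, b: 0.5 * (a + b), stripes, rings)

# On-demand evaluation: only the requested texel is computed.
texel = sink.eval(0.25, 0.75)
```

Note that evaluating the sink per texel, as here, recomputes the whole upstream graph on every access; that redundancy is exactly what the project's on-demand synthesis and caching work aims to avoid.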
We also developed a new node model for procedural texture graphs based on the composition of a vectorial Gaussian noise with a bivariate transfer function. It allows for the procedural generation of structured patterns. The results have been published as two research papers with accompanying source code. Further work targeted efficient integration with ray tracing. We developed a caching algorithm that exploits the additional knowledge texture graphs provide: how textures evolve from individual basis functions at the source nodes, and the operations applied on the way to the sink nodes. The challenge was to identify which characteristics impact the caching needs. We found tight connections between the computations required for path-guiding methods for efficient illumination computation and the texture footprint estimation needed for caching. This aspect has been studied, and applying the findings first led to two published research papers with a focus on ray tracing; research on their application to PTG caching is ongoing and will be published.
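The noise-based node model can be sketched as follows: two Gaussian noise channels form a "vectorial" noise, and a bivariate transfer function maps each noise vector to a texel value, which is what produces structured rather than purely Gaussian patterns. The sum-of-random-cosines noise and the particular transfer function below are toy assumptions standing in for the published model.

```python
import math
import random

random.seed(7)

def gaussian_channel(num_waves=64):
    """Approximately Gaussian noise as a normalized sum of random
    cosine waves (central limit theorem); a toy stand-in for the
    actual noise model."""
    waves = [(random.uniform(0.0, 2.0 * math.pi),   # orientation
              random.uniform(4.0, 8.0),             # frequency
              random.uniform(0.0, 2.0 * math.pi))   # phase
             for _ in range(num_waves)]

    def noise(u, v):
        s = sum(math.cos(f * (u * math.cos(t) + v * math.sin(t)) + p)
                for t, f, p in waves)
        return s * math.sqrt(2.0 / num_waves)       # roughly unit variance
    return noise

# Vectorial Gaussian noise: two independent channels.
n1, n2 = gaussian_channel(), gaussian_channel()

def transfer(a, b):
    """Bivariate transfer function R^2 -> [0, 1]; mapping the noise
    vector's angle through a cosine yields a structured pattern."""
    return 0.5 + 0.5 * math.cos(3.0 * math.atan2(b, a))

# Composition: texel value = transfer(vectorial noise at (u, v)).
texel = transfer(n1(0.3, 0.6), n2(0.3, 0.6))
```

The key design point is the separation of concerns: the Gaussian noise carries the spectral content, while the transfer function alone shapes the (non-Gaussian) appearance of the pattern.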
