Abstract:
The development of shader graphs has been transformative in computer graphics, enabling the creation of detailed visual textures through a node-based interface. Despite its innovation, the process remains time-consuming and reliant on trial and error, owing to complex node systems and a steep learning curve. Existing systems, built primarily on robust surface-material datasets such as Adobe Substance, generate surface materials but do not address the more complex requirements of volumetric effects such as water, fog, and fire in shader graph creation.
To overcome these constraints, this study presents a novel dataset containing both surface and volumetric shader graphs, expanding the possibilities for shader graph creation. Using this dataset, the system applies recent advances in diffusion models for graph generation to create complex shader graphs automatically and render them onto a 3D object. This approach not only saves time and effort by automating the shader creation process, but also opens up complex shader production to artists of all skill levels.
The solution prototype, implemented on top of DiGress, is trained on the new shader dataset and uses a multi-head attention layer to condition the model on text embeddings generated by the CLIP model. Initial tests, averaging 300 epochs on the dataset, achieved an average clustering score of 3.2, which aligns with existing molecular generation metrics.
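The text conditioning described above can be sketched as a single cross-attention step: graph-node features attend over text-token embeddings so that the denoising model is steered by the prompt. The sketch below is a minimal single-head illustration with NumPy; the function name `cross_attention`, the random weight initialisation, and the use of placeholder arrays in place of real CLIP embeddings are all assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(node_feats, text_emb, d_k=64, seed=0):
    """Single-head cross-attention: nodes (queries) attend to text tokens.

    node_feats: (n_nodes, d)  -- graph-node features from the denoiser
    text_emb:   (n_tokens, d) -- stand-in for CLIP text-token embeddings
    Weights are randomly initialised here purely for illustration.
    """
    rng = np.random.default_rng(seed)
    d = node_feats.shape[-1]
    Wq = rng.standard_normal((d, d_k)) / np.sqrt(d)
    Wk = rng.standard_normal((d, d_k)) / np.sqrt(d)
    Wv = rng.standard_normal((d, d_k)) / np.sqrt(d)
    Q = node_feats @ Wq            # (n_nodes, d_k)
    K = text_emb @ Wk              # (n_tokens, d_k)
    V = text_emb @ Wv              # (n_tokens, d_k)
    scores = Q @ K.T / np.sqrt(d_k)    # scaled dot-product scores
    attn = softmax(scores, axis=-1)    # each node's weights over tokens
    return attn @ V                    # text-conditioned node features

# Placeholder inputs standing in for denoiser node states and CLIP tokens.
out = cross_attention(np.ones((5, 32)), np.ones((7, 32)))
```

In the full model this operation would be repeated per attention head and the outputs concatenated, which is what the multi-head conditioning layer amounts to.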