Title: Controllable Biophysical Human Faces

Authors: Liu, Minghao; Grabli, Stephane; Speierer, Sébastien; Sarafianos, Nikolaos; Bode, Lukas; Chiang, Matt; Hery, Christophe; Davis, James; Aliaga, Carlos; Wang, Beibei; Wilkie, Alexander

Date issued: 2025-06-20 (2025)
ISSN: 1467-8659
DOI: 10.1111/cgf.70170 (https://doi.org/10.1111/cgf.70170)
Handle: https://diglib.eg.org/handle/10.1111/cgf70170
Pages: 13
License: Attribution 4.0 International License

Abstract: We present a novel generative model that synthesizes photorealistic, biophysically plausible faces by capturing the intricate relationships between facial geometry and biophysical attributes. Our approach models facial appearance in a biophysically grounded manner, allowing the editing of both high-level attributes such as age and gender and low-level biophysical properties such as melanin level and blood content. This enables continuous modeling of physical skin properties, correlating changes in those properties with changes in shape. We showcase the capabilities of our framework beyond its role as a generative model through two practical applications: editing the texture maps of 3D faces that have already been captured, and serving as a strong prior for face reconstruction when combined with differentiable rendering. Our model allows for the creation of physically based, relightable, editable faces with consistent topology and UV layout that can be integrated into traditional computer graphics pipelines.

Keywords: biophysical face synthesis; controllable 3D generation; skin appearance modeling; facial editing; diffusion models