Title: Art-directing Appearance using an Environment Map Latent Space
Authors: Petikam, Lohit; Chalmers, Andrew; Anjyo, Ken; Rhee, Taehyun
Editors: Lee, Sung-Hee; Zollmann, Stefanie; Okabe, Makoto; Wünsche, Burkhard
Date: 2021-10-14
ISBN: 978-3-03868-162-5
DOI: https://doi.org/10.2312/pg.20211386
Handle: https://diglib.eg.org:443/handle/10.2312/pg20211386
Pages: 43-48
CCS Concepts: Computing methodologies -> Dimensionality reduction and manifold learning; Rendering

Abstract:
In look development, environment maps (EMs) are used to verify 3D appearance under varied lighting (e.g., overcast, sunny, and indoor). Artists can assign only one fixed material, making it laborious to edit appearance uniquely for every EM. In film post-production, artists can art-direct material and lighting; however, this is impossible in dynamic real-time games and live augmented reality (AR), where environment lighting is unpredictable. We present a new workflow for customizing appearance variation across a wide range of EM lighting in live applications. Appearance edits can be predefined and then automatically adapted to environment lighting changes. We achieve this by learning a novel 2D latent space of varied EM lighting. The latent space lets artists browse EMs in a semantically meaningful 2D view. For different EMs, artists can paint different material and lighting parameter values directly on the latent space. We robustly encode new EMs into the same space, for automatic look-up of the desired appearance. This solves a new problem of preserving art direction in live applications, without any artist intervention.
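The abstract does not describe how the latent space is learned or queried, but the workflow it outlines (embed EMs into a 2D space, paint parameter values over that space, encode new EMs to look those values up) can be illustrated with a minimal sketch. The sketch below is not the paper's method: it substitutes a PCA embedding of simple log-luminance statistics for the learned latent space, and nearest-neighbor sampling of a painted parameter image for the look-up. The names em_features, LatentSpace2D, and lookup_painted are hypothetical.

    # Minimal illustrative sketch, NOT the authors' implementation.
    # Assumes: EMs are HDR images (H x W x 3 float arrays), and artists
    # paint parameter values into a 2D image indexed by latent coordinates.
    import numpy as np

    def em_features(em):
        """Toy lighting descriptor: log-luminance statistics of an EM.
        Stands in for whatever features the learned embedding would use."""
        lum = 0.2126 * em[..., 0] + 0.7152 * em[..., 1] + 0.0722 * em[..., 2]
        log_lum = np.log1p(lum)
        return np.array([log_lum.mean(), log_lum.std(),
                         np.percentile(log_lum, 95),
                         np.percentile(log_lum, 5)])

    class LatentSpace2D:
        """Hypothetical 2D latent space: a linear (PCA) projection of EM
        features, normalized to [0, 1]^2 so coordinates can index a
        painted parameter map."""
        def fit(self, ems):
            X = np.stack([em_features(em) for em in ems])
            self.mean = X.mean(axis=0)
            # Top-2 principal directions via SVD of the centered features.
            _, _, vt = np.linalg.svd(X - self.mean, full_matrices=False)
            self.basis = vt[:2].T                       # shape (F, 2)
            Z = (X - self.mean) @ self.basis
            self.lo, self.hi = Z.min(axis=0), Z.max(axis=0)
            return self

        def encode(self, em):
            """Project a new EM into the same 2D space (clamped to [0,1]^2)."""
            z = (em_features(em) - self.mean) @ self.basis
            return np.clip((z - self.lo) / (self.hi - self.lo + 1e-8), 0.0, 1.0)

    def lookup_painted(param_map, uv):
        """Sample an artist-painted parameter image (H x W x C) at a 2D
        latent coordinate; nearest-neighbor for simplicity."""
        h, w = param_map.shape[:2]
        i = min(int(uv[1] * (h - 1) + 0.5), h - 1)
        j = min(int(uv[0] * (w - 1) + 0.5), w - 1)
        return param_map[i, j]

    # Usage: embed a new EM at runtime and fetch pre-painted parameters.
    # ems = [load_hdr(p) for p in training_paths]   # hypothetical loader
    # space = LatentSpace2D().fit(ems)
    # params = lookup_painted(painted_params, space.encode(new_em))

Under these assumptions, the artist paints parameter values into painted_params offline, and at runtime each newly captured EM is encoded into the 2D space and the matching values are fetched automatically, which mirrors the abstract's "predefine edits, then adapt to lighting changes" workflow.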