Title: An Interactive Tuning Method for Generator Networks Trained by GAN
Authors: Zhou, Mengyuan; Yamaguchi, Yasushi
Editors: Cabiddu, Daniela; Schneider, Teseo; Allegra, Dario; Catalano, Chiara Eva; Cherchi, Gianmarco; Scateni, Riccardo
Date available: 2022-11-08
Date issued: 2022
ISBN: 978-3-03868-191-5
ISSN: 2617-4855
DOI: https://doi.org/10.2312/stag.20221269
URI: https://diglib.eg.org:443/handle/10.2312/stag20221269
Pages: 151-160 (10 pages)
License: Attribution 4.0 International License
CCS Concepts: Computing methodologies -> Image processing; Neural networks; Human-centered computing -> Empirical studies in interaction design
Keywords: Computing methodologies; Image processing; Neural networks; Human-centered computing; Empirical studies in interaction design

Abstract: Recent studies on GANs have achieved impressive results in image synthesis. However, they are still imperfect, and output images may contain unnatural regions. We propose a tuning method for generator networks trained by GANs that improves their results by interactively removing unexpected objects and textures or changing object colors. Our method finds and ablates the units in the generator network that are highly related to specific regions or their colors. Compared to related studies, our method can tune pre-trained generator networks without relying on additional information such as segmentation-based networks. We built an interactive system based on our method that is capable of tuning generator networks so that the resulting images match expectations. Experiments show that our method removes only the unexpected objects and textures, and can also change the color of a selected area. The method also offers hints for discussing which layers and units of generator networks are associated with objects, textures, or colors.
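The unit-ablation idea described in the abstract — identifying the generator feature-map channels ("units") most associated with a user-selected region and zeroing them — can be sketched roughly as follows. This is a minimal illustration, not the paper's actual implementation: the region-scoring heuristic (mean activation inside the mask) and all function names here are assumptions.

```python
import numpy as np

def rank_units_by_region(features, region_mask):
    """Score each unit (channel) of a conv feature map by its mean
    activation inside a user-selected region.

    features:    (C, H, W) activations from one generator layer
    region_mask: (H, W) boolean mask of the unwanted region
    Returns unit indices, most region-related first.
    """
    # Mean activation of each channel over the masked pixels only.
    scores = features[:, region_mask].mean(axis=1)
    return np.argsort(scores)[::-1]

def ablate_units(features, units):
    """Zero out the selected units, simulating ablation of the
    corresponding generator channels."""
    out = features.copy()
    out[units] = 0.0
    return out

# Toy example: make channel 2 fire strongly inside the region,
# so it should rank first and be ablated.
rng = np.random.default_rng(0)
feats = rng.random((4, 8, 8)).astype(np.float32)
mask = np.zeros((8, 8), dtype=bool)
mask[2:6, 2:6] = True
feats[2, mask] += 5.0

order = rank_units_by_region(feats, mask)
ablated = ablate_units(feats, order[:1])
```

In an actual generator, the ablated activations would be fed forward through the remaining layers to re-synthesize the image with the targeted object or texture suppressed.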