Authors: Pena-Pena, Karelia; Arce, Gonzalo R.
Editors: Ghosh, Abhijeet; Wei, Li-Yi
Date issued: 2022-07-01
ISBN: 978-3-03868-187-8
ISSN: 1727-3463
DOI: https://doi.org/10.2312/sr.20221160
URI: https://diglib.eg.org:443/handle/10.2312/sr20221160

Abstract: Stippling illustrations of CEOs, authors, and world leaders have become an iconic style. Dot after dot is meticulously placed by professional artists to complete a hedcut, making it an extremely time-consuming and painstaking task. The automatic generation of hedcuts by a computer is not simple, since an algorithm must capture both the structure of faces and the binary rendering of illustrations. Current challenges relate to the shape and placement of the dots without generating unwanted regularity artifacts. Recent neural style transfer techniques successfully separate the style from the content information of an image. However, such an approach, as it is, is not suitable for stippling rendering, since its output suffers from spillover artifacts and the placement of dots is arbitrary. The lack of aligned training data pairs also constrains the use of other deep-learning-based techniques. To address these challenges, we propose a new neural-based style transfer algorithm that uses side information to impose additional constraints on the direction of the dots. Experimental results show significant improvement in rendering hedcuts.

License: Attribution 4.0 International License (CC BY 4.0)
CCS Concepts: Computing methodologies --> Non-photorealistic rendering
Keywords: Computing methodologies; Non-photorealistic rendering; Hedcut drawings
Title: Rendering Hedcut Style Portraits
Pages: 107-115 (9 pages)