Title: Fashion Transfer: Dressing 3D Characters from Stylized Fashion Sketches
Authors: Fondevilla, Amelie; Rohmer, Damien; Hahmann, Stefanie; Bousseau, Adrien; Cani, Marie-Paule
Editors: Benes, Bedrich; Hauser, Helwig
Date: 2021-10-08
ISSN: 1467-8659
DOI: 10.1111/cgf.14390 (https://doi.org/10.1111/cgf.14390)
Handle: https://diglib.eg.org:443/handle/10.1111/cgf14390
Pages: 466-483
Keywords: cloth modelling; geometric modelling; modelling

Abstract: Fashion design often starts with hand-drawn, expressive sketches that communicate the essence of a garment over idealized human bodies. We propose an approach to automatically dress virtual characters from such input, complemented with user annotations. In contrast to prior work, which requires users to draw garments with accurate proportions over each virtual character to be dressed, our method follows a style-transfer strategy: the information extracted from a single annotated fashion sketch can inform the synthesis of one or many new garments with similar style yet different proportions. In particular, we define the style of a loose garment by its silhouette and folds, which we extract from the drawing. Key to our method is our strategy for extracting both the shape and the repetitive patterns of folds from the 2D input. As our results show, each input sketch can be used to dress a variety of characters of different morphologies, from virtual humans to cartoon-style characters.