Title: An Inverse Procedural Modeling Pipeline for Stylized Brush Stroke Rendering
Authors: Li, Hao; Guan, Zhongyue; Wang, Zeyu
Editors: Hu, Ruizhen; Charalambous, Panayiotis
Date issued: 2024-04-30
Year: 2024
ISBN: 978-3-03868-237-0
ISSN: 1017-4656
DOI: https://doi.org/10.2312/egs.20241024
URI: https://diglib.eg.org/handle/10.2312/egs20241024
Pages: 4
License: Attribution 4.0 International License (CC BY 4.0)
CCS Concepts: Computing methodologies → Image-based rendering; Computing methodologies → Machine learning; Applied computing → Media arts

Abstract: Stylized brush strokes are crucial for digital artists to create drawings that express a desired artistic style. To obtain the ideal brush, artists need to spend much time manually tuning parameters and creating customized brushes, which hinders the completion, redrawing, or modification of digital drawings. This paper proposes an inverse procedural modeling pipeline for predicting brush parameters and rendering stylized strokes given a single sample drawing. Our pipeline involves patch segmentation as a preprocessing step, parameter prediction based on deep learning, and brush generation using a procedural rendering engine. Our method enhances the overall experience of digital drawing recreation by empowering artists with more intuitive control and consistent brush effects.