Title: Fiber-Level On-the-Fly Procedural Textiles
Authors: Luan, Fujun; Zhao, Shuang; Bala, Kavita; Zwicker, Matthias; Sander, Pedro
Date: 2017-06-19 (2017)
ISSN: 1467-8659
DOI: 10.1111/cgf.13230 (https://doi.org/10.1111/cgf.13230)
Handle: https://diglib.eg.org:443/handle/10.1111/cgf13230
Pages: 123-135
Subject: Computing methodologies > Rendering

Abstract: Procedural textile models are compact, easy to edit, and can achieve state-of-the-art realism with fiber-level details. However, these complex models generally need to be fully instantiated (a.k.a. realized) into 3D volumes or fiber meshes and stored in memory. We introduce a novel realization-minimizing technique that enables physically based rendering of procedural textiles without the need for full model realizations. The key ingredients of our technique are new data structures and search algorithms that look up regular and flyaway fibers on the fly, efficiently and consistently. Our technique works with compact fiber-level procedural yarn models in their exact form, with no approximation imposed. In practice, our method can render very large models that are practically unrenderable using existing methods, while using considerably less memory (60-200x less) and achieving good performance.
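
To illustrate the general idea of evaluating fiber geometry on demand rather than storing a fully realized fiber mesh, the sketch below generates a regular-fiber point procedurally at query time from a yarn/ply/fiber index and an arc-length parameter. This is only a minimal, assumed illustration of on-the-fly procedural yarn evaluation, not the paper's actual data structures or search algorithms; all parameters (plyRadius, fiberRadius, twist rates) are hypothetical placeholders.

```cpp
// Minimal illustrative sketch: compute a fiber point procedurally on demand,
// e.g. when a ray enters a yarn's bounding volume, instead of storing a
// fully realized fiber mesh in memory. Not the paper's method; parameters
// below are hypothetical.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

const double PI = 3.14159265358979323846;

// Regular fiber modeled as a helix around a ply center, which is itself a
// helix around the yarn axis (the z-axis here). Nothing is precomputed;
// the point is generated from (plyIndex, fiberIndex, t) at query time.
Vec3 fiberPoint(int plyIndex, int numPlies,
                int fiberIndex, int numFibers,
                double t)                      // arc-length parameter along the yarn
{
    const double plyRadius   = 0.1;   // hypothetical
    const double fiberRadius = 0.04;  // hypothetical
    const double plyTwist    = 2.0;   // ply turns per unit length (hypothetical)
    const double fiberTwist  = 8.0;   // fiber turns per unit length (hypothetical)

    // Ply center: helix around the yarn axis.
    double plyPhase = 2.0 * PI * plyIndex / numPlies;
    double pa = 2.0 * PI * plyTwist * t + plyPhase;
    Vec3 ply { plyRadius * std::cos(pa), plyRadius * std::sin(pa), t };

    // Fiber center: helix around the ply center.
    double fiberPhase = 2.0 * PI * fiberIndex / numFibers;
    double fa = 2.0 * PI * fiberTwist * t + fiberPhase;
    return { ply.x + fiberRadius * std::cos(fa),
             ply.y + fiberRadius * std::sin(fa),
             ply.z };
}

int main() {
    // Query a single fiber point on the fly; no realized 3D volume is kept.
    Vec3 p = fiberPoint(/*ply*/ 1, 3, /*fiber*/ 7, 50, /*t*/ 0.25);
    std::printf("%f %f %f\n", p.x, p.y, p.z);
    return 0;
}
```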