Text2Mat: Generating Materials from Text

Authors: He, Zhen; Guo, Jie; Zhang, Yan; Tu, Qinghao; Chen, Mufan; Guo, Yanwen; Wang, Pengyu; Dai, Wei
Editors: Chaine, Raphaëlle; Deng, Zhigang; Kim, Min H.
Date: 2023-10-09
Year: 2023
ISBN: 978-3-03868-234-9
DOI: https://doi.org/10.2312/pg.20231275
URI: https://diglib.eg.org:443/handle/10.2312/pg20231275
Pages: 89-97 (9 pages)
License: Attribution 4.0 International License
CCS Concepts: Computing methodologies -> Rendering

Abstract: Specific materials are often associated with a certain type of object in the real world. They simulate the way the surface of the object interacts with light and are named after that type of object. We observe that the text labels of materials carry high-level semantic information, which can serve as guidance for generating specific materials. Based on this observation, we propose Text2Mat, a text-guided material generation framework. To meet the demands of material generation from text descriptions, we construct a large set of PBR materials with specific text labels; each material is accompanied by detailed text descriptions that match its visual appearance. Furthermore, to control the texture and spatial layout of generated materials through text, we introduce texture attribute labels and extra attributes describing regular materials. Using this dataset, we train a neural network adapted from Stable Diffusion to achieve text-based material generation. Extensive experiments and rendering results demonstrate that Text2Mat can generate materials whose spatial layout and texture style closely correspond to the text descriptions.
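As a rough illustration of how a material label and texture-attribute labels of the kind described in the abstract might be combined into a single conditioning prompt for a text-guided generator, here is a minimal sketch. The function name, label vocabulary, and prompt format are hypothetical assumptions for illustration only, not the Text2Mat dataset's actual schema or pipeline:

```python
# Hypothetical sketch: joining a material's text label with its
# texture/layout attribute labels into one conditioning string.
# The format "material, attr1, attr2, ..." is an illustrative
# assumption, not the paper's actual prompt scheme.

def build_material_prompt(material: str, attributes: list[str]) -> str:
    """Combine a material label with optional texture-attribute labels."""
    if not attributes:
        return material
    return material + ", " + ", ".join(attributes)

# Example: a brick material with layout and roughness attributes.
prompt = build_material_prompt("red brick wall",
                               ["regular grid layout", "rough surface"])
```

In a text-conditioned diffusion setup, a string assembled this way would be fed to the text encoder as the generation condition.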