Title: Efficient Texture Parameterization Driven by Perceptual-Loss-on-Screen
Authors: Sun, Haoran; Wang, Shiyi; Wu, Wenhai; Jin, Yao; Bao, Hujun; Huang, Jin
Editors: Umetani, Nobuyuki; Wojtan, Chris; Vouga, Etienne
Published: 2022-10-04
Journal: Computer Graphics Forum (ISSN 1467-8659)
DOI: https://doi.org/10.1111/cgf.14696
Handle: https://diglib.eg.org:443/handle/10.1111/cgf14696
Pages: 507-518 (12 pages)

Abstract: Texture mapping is a ubiquitous technique for enriching the visual appearance of a mesh: it maps the desired signal (e.g., diffuse color) on the mesh to a texture image discretized into pixels through a bijective parameterization. Achieving high visual quality generally requires a large number of pixels, which imposes a heavy burden on storage, memory, and transmission. We propose to use a perceptual model and a rendering procedure to measure the loss caused by this discretization, and then to optimize the parameterization for efficiency, i.e., to use fewer pixels at a comparable perceptual loss. The general perceptual model and rendering procedure can be very complicated, and the anisotropy rooted in the square shape of pixels makes the problem more difficult to solve. We adopt a two-stage strategy and use Bayesian optimization in the triangle-wise stage. With our carefully designed weighting scheme, the mesh-wise optimization can take the triangle-wise perceptual loss into consideration under a global conforming requirement. Compared with many parameterizations that are manually designed, driven by interpolation error, or driven by isotropic energy, ours uses significantly fewer pixels at a comparable perceptual loss, or vice versa.

Keywords: Geometric Modeling, Surface Parameterization, Texture Mapping, Perceptual Loss
CCS Concepts: Computing methodologies → Shape modeling
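The core idea in the abstract, that discretizing a surface signal into a fixed texel budget incurs a measurable loss that shrinks as the budget grows, can be illustrated with a toy sketch. This is not the paper's method: it works in 1-D, uses plain MSE as a stand-in for the perceptual model, and linear interpolation as a stand-in for the rendering procedure; the function name `discretization_loss` is invented for this illustration.

```python
import numpy as np

def discretization_loss(signal, num_texels):
    """Toy stand-in for a perceptual loss: sample a 1-D 'surface signal'
    into a texture of num_texels pixels, reconstruct it with linear
    interpolation (mimicking texture filtering during rendering), and
    return the MSE against the original signal."""
    x = np.linspace(0.0, 1.0, len(signal))
    texel_centers = (np.arange(num_texels) + 0.5) / num_texels
    texture = np.interp(texel_centers, x, signal)   # "bake" the texture
    recon = np.interp(x, texel_centers, texture)    # "render" it back
    return float(np.mean((signal - recon) ** 2))

# A smooth but non-trivial test signal (4 sine periods).
x = np.linspace(0.0, 1.0, 2000)
signal = np.sin(8 * np.pi * x)

losses = {n: discretization_loss(signal, n) for n in (16, 64, 256)}
# Fewer texels -> larger discretization loss.
assert losses[16] > losses[64] > losses[256]
```

The paper's contribution is to drive the parameterization itself by such a loss, so that regions where the rendered error is perceptually visible receive more texels and regions where it is not receive fewer.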