Title: 3D Body Shapes Estimation from Dressed-Human Silhouettes
Authors: Song, Dan; Tong, Ruofeng; Chang, Jian; Yang, Xiaosong; Tang, Min; Zhang, Jian Jun
Editors: Eitan Grinspun, Bernd Bickel, Yoshinori Dobashi
Date issued: 2016-10-11
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.13012
URI: https://diglib.eg.org:443/handle/10.1111/cgf13012
Keywords: I.3.m [Computer Graphics]: Miscellaneous; Image-based modeling
Pages: 147-156

Abstract: Estimation of 3D body shapes from dressed-human photos is an important but challenging problem in virtual fitting. We propose a novel automatic framework to efficiently estimate 3D body shapes under clothes. We construct a database of 3D naked and dressed body pairs, from which we learn to automatically predict the 3D positions of body landmarks (which further constrain a parametric human body model) from dressed-human silhouettes. Critical vertices on 3D registered human bodies are selected as landmarks to represent body shapes, avoiding the time-consuming vertex-correspondence search otherwise required for parametric body reconstruction. Our method estimates 3D body shapes from dressed-human silhouettes within 4 seconds, whereas the fastest previously reported method needs about 1 minute. In addition, our estimation error is within the size tolerance of the clothing industry. We dress 6,042 naked bodies with 3 sets of common clothes using physically based cloth simulation. To the best of our knowledge, we are the first to construct such a database of 3D naked and dressed body pairs, and it may contribute to the areas of human body shape estimation and cloth simulation.