Title: 3D Sketching for Interactive Model Retrieval in Virtual Reality
Authors: Giunchi, Daniele; James, Stuart; Steed, Anthony
Editors: Aydın, Tunç; Sýkora, Daniel
Date available: 2018-11-10
Date issued: 2018
ISBN: 978-1-4503-5892-7
ISSN: 2079-8679
DOI: https://doi.org/10.1145/3229147.3229166
URI: https://diglib.eg.org:443/handle/10.1145/3229147-3229166

Abstract: We describe a novel method for searching 3D model collections using free-form sketches drawn within a virtual environment as queries. As opposed to traditional sketch retrieval, our queries are drawn directly onto an example model. Using immersive virtual reality, the user can express a query through a sketch that demonstrates the desired structure, color, and texture. Unlike previous sketch-based retrieval methods, users remain immersed within the environment, without relying on textual queries or 2D projections that can disconnect the user from it. We evaluate several descriptors on a set of test queries, measuring retrieval precision to select the most accurate one. We show how a convolutional neural network (CNN) can create multi-view representations of colored 3D sketches. Using this descriptor representation, our system can rapidly retrieve models, giving the user an interactive way to navigate large object datasets. Through a user study we demonstrate that, with our VR 3D model retrieval system, users can search more quickly and intuitively than with a naive linear browsing method. Using our system, users can rapidly populate a virtual environment with specific models from a very large database, so the technique has the potential to be broadly applicable in immersive editing systems.
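
The abstract describes a pipeline in which a colored 3D sketch is rendered from multiple viewpoints, encoded by a CNN into a single descriptor, and matched against descriptors of the model database. The following Python sketch is a minimal, hypothetical illustration of that idea, not the authors' implementation: it assumes pre-rendered view images, uses an off-the-shelf pretrained ResNet-18 as the feature extractor, max-pools per-view features into one vector, and retrieves by cosine similarity; all file names and the pooling choice are assumptions.

    # Hypothetical multi-view CNN descriptor pipeline (illustrative only).
    import numpy as np
    import torch
    import torchvision.models as models
    import torchvision.transforms as T
    from PIL import Image

    # Pretrained backbone with the classifier removed, used as a feature extractor.
    backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    backbone.fc = torch.nn.Identity()
    backbone.eval()

    preprocess = T.Compose([
        T.Resize((224, 224)),
        T.ToTensor(),
        T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])

    def multiview_descriptor(view_images):
        """Encode a list of PIL view renderings and max-pool them into one vector."""
        with torch.no_grad():
            batch = torch.stack([preprocess(im) for im in view_images])
            feats = backbone(batch)            # shape: (num_views, 512)
            pooled = feats.max(dim=0).values   # view pooling across renderings
        return torch.nn.functional.normalize(pooled, dim=0).numpy()

    def retrieve(query_descriptor, database_descriptors, k=5):
        """Return indices of the k most similar database models (cosine similarity)."""
        db = database_descriptors / np.linalg.norm(database_descriptors, axis=1, keepdims=True)
        scores = db @ query_descriptor
        return np.argsort(-scores)[:k]

    # Example usage with placeholder view renderings of a sketch (file names assumed):
    # views = [Image.open(f"sketch_view_{i}.png").convert("RGB") for i in range(12)]
    # q = multiview_descriptor(views)
    # top_k = retrieve(q, np.load("model_descriptors.npy"))

Precomputing the database descriptors offline is what makes the interactive, rapid retrieval described in the abstract possible; at query time only the sketch's views need to be encoded.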