Automatic Selection of Video Frames for Path Regularization and 3D Reconstruction

Date
2016
Publisher
The Eurographics Association
Abstract
Video sequences can be a valuable source to document the state of objects and sites. They are easy to acquire, and they usually ensure complete coverage of the object of interest. One possible use is to recover the acquisition path, or the 3D shape of the scene, by applying structure-from-motion techniques to a representative set of frames extracted from the video. This paper presents an automatic method for the extraction of a predefined number of representative frames that ensures an accurate reconstruction of the sequence path, and possibly enhances the 3D reconstruction of the scene. The automatic extraction is obtained by analyzing adjacent frames in a starting subset, and adding/removing frames so that the distance between them remains constant. This ensures the reconstruction of a regularized path and optimized coverage of the whole scene. Finally, more frames are added in the portions of the sequence where more detailed objects are framed, ensuring a better description of the sequence and a more accurate dense reconstruction. The method is automatic, fast, and independent of any assumption about the acquired object or the acquisition strategy. It was tested on a variety of video sequences, showing that satisfactory results can be obtained regardless of the length and quality of the input.
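The core idea in the abstract, selecting frames so that the distance between adjacent selected frames stays roughly constant along the acquisition path, can be illustrated with a small sketch. This is not the authors' implementation: it assumes that an estimated 3D camera position is already available for each frame (e.g., from an initial sparse structure-from-motion pass) and simply resamples the path at constant arc length; the function name `select_frames` and the synthetic input are illustrative only.

```python
from typing import List, Sequence, Tuple

Vec3 = Tuple[float, float, float]

def select_frames(positions: Sequence[Vec3], n: int) -> List[int]:
    """Pick n frame indices spaced at roughly constant arc length
    along the estimated camera path (hypothetical sketch)."""
    # Cumulative arc length of the camera path, one entry per frame.
    cum = [0.0]
    for (x0, y0, z0), (x1, y1, z1) in zip(positions, positions[1:]):
        step = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
        cum.append(cum[-1] + step)
    total = cum[-1]

    # Target arc-length values, evenly spaced from start to end.
    targets = [total * i / (n - 1) for i in range(n)]

    picked, j = [], 0
    for t in targets:
        # Advance to the last frame whose cumulative length is below t.
        while j < len(cum) - 1 and cum[j + 1] < t:
            j += 1
        # Choose whichever of the two bracketing frames is closer to t.
        if j < len(cum) - 1 and abs(cum[j + 1] - t) < abs(cum[j] - t):
            picked.append(j + 1)
        else:
            picked.append(j)

    # Remove consecutive duplicates (can occur on very short paths).
    out: List[int] = []
    for i in picked:
        if not out or i != out[-1]:
            out.append(i)
    return out

# Synthetic path with accelerating camera motion: frames bunch up early,
# so constant-arc-length selection skips more of the later, faster part.
path = [(0.1 * i * i, 0.0, 0.0) for i in range(10)]
print(select_frames(path, 3))  # → [0, 6, 9]
```

A real pipeline would also need the refinement step the abstract mentions (densifying the selection where detailed objects are framed), which this sketch omits.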
@inproceedings{10.2312:gch.20161376,
  booktitle = {Eurographics Workshop on Graphics and Cultural Heritage},
  editor    = {Chiara Eva Catalano and Livio De Luca},
  title     = {{Automatic Selection of Video Frames for Path Regularization and 3D Reconstruction}},
  author    = {Pavoni, Gaia and Dellepiane, Matteo and Callieri, Marco and Scopigno, Roberto},
  year      = {2016},
  publisher = {The Eurographics Association},
  ISSN      = {2312-6124},
  ISBN      = {978-3-03868-011-6},
  DOI       = {10.2312/gch.20161376}
}