
dc.contributor.author: Palma, Gianpaolo [en_US]
dc.contributor.author: Cignoni, Paolo [en_US]
dc.contributor.author: Boubekeur, Tamy [en_US]
dc.contributor.author: Scopigno, Roberto [en_US]
dc.contributor.editor: Chen, Min and Zhang, Hao (Richard) [en_US]
dc.date.accessioned: 2016-09-27T10:02:02Z
dc.date.available: 2016-09-27T10:02:02Z
dc.date.issued: 2016
dc.identifier.issn: 1467-8659
dc.identifier.uri: http://dx.doi.org/10.1111/cgf.12730
dc.identifier.uri: https://diglib.eg.org:443/handle/10.1111/cgf12730
dc.description.abstract: Detecting geometric changes between two 3D captures of the same location performed at different moments is a critical operation for all systems requiring a precise segmentation between change and no‐change regions. Such application scenarios include 3D surface reconstruction, environment monitoring, natural events management and forensic science. Unfortunately, typical 3D scanning setups cannot provide any one‐to‐one mapping between measured samples in static regions: in particular, both extrinsic and intrinsic sensor parameters may vary over time while sensor noise and outliers additionally corrupt the data. In this paper, we adopt a multi‐scale approach to robustly tackle these issues. Starting from two point clouds, we first remove outliers using a probabilistic operator. Then, we detect the actual change using the implicit surface defined by the point clouds under a Growing Least Square reconstruction that, compared to the classical proximity measure, offers a more robust change/no‐change characterization near the temporal intersection of the scans and in the areas exhibiting different sampling density and direction. The resulting classification is enhanced with a spatial reasoning step to solve critical geometric configurations that are common in man‐made environments. We validate our approach on a synthetic test case and on a collection of real data sets acquired using commodity hardware. Finally, we show how 3D reconstruction benefits from the resulting precise change/no‐change segmentation. [en_US]
dc.publisher: Copyright © 2016 The Eurographics Association and John Wiley & Sons Ltd. [en_US]
dc.subject: point‐based methods
dc.subject: digital geometry processing
dc.subject: object scanning/acquisition
dc.subject: [Computer Graphics]: Shape modelling—Point‐based models
dc.title: Detection of Geometric Temporal Changes in Point Clouds [en_US]
dc.description.seriesinformation: Computer Graphics Forum
dc.description.sectionheaders: Articles
dc.description.volume: 35
dc.description.number: 6
dc.identifier.doi: 10.1111/cgf.12730
dc.identifier.pages: 33-45
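
The abstract above describes a pipeline that classifies points as change/no-change by measuring their distance to an implicit surface fitted to the other scan, rather than by raw nearest-neighbor proximity. The following is a minimal illustrative sketch of that idea, not the authors' implementation: it substitutes a local least-squares plane fit for the Growing Least Squares reconstruction used in the paper, and the neighborhood size k, threshold tau, and all function names are assumptions chosen for the example.

```python
# Illustrative sketch only: classify points of cloud B as changed w.r.t.
# cloud A by comparing each point's distance to a local plane fitted to
# its neighbors in A (a crude stand-in for the paper's Growing Least
# Squares implicit surface). k and tau are arbitrary example values.
import numpy as np
from scipy.spatial import cKDTree

def point_to_local_plane(p, neighbors):
    """Distance from p to the least-squares plane of its neighbors."""
    c = neighbors.mean(axis=0)
    # Plane normal = right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(neighbors - c)
    n = vt[-1]
    return abs(np.dot(p - c, n))

def detect_changes(cloud_a, cloud_b, k=16, tau=0.05):
    """Boolean mask over cloud_b: True = changed with respect to cloud_a."""
    tree = cKDTree(cloud_a)
    changed = np.zeros(len(cloud_b), dtype=bool)
    for i, p in enumerate(cloud_b):
        _, idx = tree.query(p, k=k)
        changed[i] = point_to_local_plane(p, cloud_a[idx]) > tau
    return changed

# Toy usage: a flat patch versus the same patch with a bump on 50 samples.
rng = np.random.default_rng(0)
a = np.column_stack([rng.uniform(0, 1, (500, 2)), np.zeros(500)])
b = a.copy()
b[:50, 2] += 0.3  # simulate a geometric change
print(detect_changes(a, b).sum(), "points flagged as changed")
```

The surface-distance test is what makes the classification tolerant to the sampling mismatches the abstract mentions: two scans of the same static wall rarely place samples at identical positions, so nearest-neighbor distance alone over-reports change where density or scan direction differs, while distance to a fitted surface stays near zero.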

