Title: A Generalized 2D and 3D Multi-Sensor Data Integration Approach based on Signed Distance Functions for Multi-Modal Robotic Mapping
Authors: May, Stefan; Koch, Philipp; Koch, Rainer; Merkl, Christian; Pfitzner, Christian; Nüchter, Andreas
Editors: Jan Bender; Arjan Kuijper; Tatiana von Landesberger; Holger Theisel; Philipp Urban
Date: 2014-12-16
Year: 2014
ISBN: 978-3-905674-74-3
DOI: https://doi.org/10.2312/vmv.20141281

Abstract: This paper describes a data integration approach for arbitrary 2D/3D depth sensing units that exploits the assets of the signed distance function. The underlying framework generalizes the KinectFusion approach with an object-oriented model respecting different sensor modalities. For instance, measurements of 2D/3D laser range finders and RGB-D cameras can be integrated into the same representation. As an example, an environment is reconstructed with a 3D laser range finder, while fine details of objects of interest are added by closer inspection with an RGB-D sensor. A typical application of this approach is exploration in rescue environments, where large-scale mapping is performed on the basis of long-range laser range finders while hollows are inspected with lightweight sensors attached to a manipulator arm.
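
The core idea summarized in the abstract, fusing depth measurements from different sensors into one signed-distance representation, can be illustrated with a minimal sketch. This is a hypothetical KinectFusion-style running weighted average over a voxel grid, not the authors' actual implementation; the function name `integrate` and all parameters are illustrative assumptions.

```python
# Hypothetical sketch of weighted signed-distance fusion (KinectFusion-style);
# not the paper's implementation, only the general update rule it builds on.
import numpy as np

def integrate(tsdf, weights, new_sdf, new_weight, max_weight=64.0):
    """Fuse a new per-voxel signed-distance observation into the global grid
    via a running weighted average; weights are capped to stay responsive."""
    w = weights + new_weight
    fused = (tsdf * weights + new_sdf * new_weight) / np.maximum(w, 1e-9)
    return fused, np.minimum(w, max_weight)

# Toy example: two sensors (e.g. a laser range finder and an RGB-D camera)
# observing the same four voxels contribute to one shared representation.
tsdf = np.zeros(4)
weights = np.zeros(4)
tsdf, weights = integrate(tsdf, weights, np.array([0.1, -0.2, 0.05, 0.3]), 1.0)
tsdf, weights = integrate(tsdf, weights, np.array([0.2, -0.1, 0.05, 0.1]), 1.0)
```

Because each measurement only touches the shared grid through this update, any sensor model that can produce per-voxel signed distances can be integrated the same way, which is the generalization the abstract describes.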