CEIG2021
Browsing CEIG2021 by Author "Chica, Antonio"
Now showing 1 - 3 of 3
Item: CEIG 2021: Frontmatter (Eurographics Association, 2021) Ortega, Lidia M.; Chica, Antonio

Item: Intensity-Guided Exposure Correction for Indoor LiDAR Scans (The Eurographics Association, 2021) Comino Trinidad, Marc; Andújar, Carlos; Bosch, Carles; Chica, Antonio; Muñoz-Pandiella, Imanol
Terrestrial Laser Scanners, also known as LiDAR, are often equipped with color cameras, so that both infrared and RGB values are measured for each point sample. High-end scanners also provide panoramic High Dynamic Range (HDR) images. Rendering such HDR colors on conventional displays requires a tone-mapping operator, and obtaining a suitable exposure everywhere in the image can be challenging for 360° indoor scenes with a variety of rooms and illumination sources. In this paper we present a simple-to-implement tone-mapping algorithm for HDR panoramas captured by LiDAR equipment. The key idea is to choose, on a per-pixel basis, an exposure correction factor based on the local intensity (infrared reflectivity). Since LiDAR intensity values for indoor scenes are nearly independent of the external illumination, we show that intensity-guided exposure correction often outperforms state-of-the-art tone-mapping operators on this kind of scene.

Item: Neural Colorization of Laser Scans (The Eurographics Association, 2021) Comino Trinidad, Marc; Andújar, Carlos; Bosch, Carles; Chica, Antonio; Muñoz-Pandiella, Imanol
Laser scanners enable the digitization of 3D surfaces by generating a point cloud in which each point sample includes an intensity (infrared reflectivity) value. Some LiDAR scanners also incorporate cameras to capture the color of the surfaces visible from the scanner location. Getting usable colors everywhere across 360° scans is a challenging task, especially for indoor scenes.
LiDAR scanners lack flashes, and placing proper light sources for a 360° indoor scene is either unfeasible or undesirable. As a result, the color data from LiDAR scans is often of inadequate quality, either because of poor exposure (areas that are too bright or too dark) or because of severe illumination changes between scans (e.g. direct sunlight vs. overcast lighting). In this paper, we present a new method to recover plausible color data from the infrared data available in LiDAR scans. The main idea is to train an adapted image-to-image translation network using color and intensity values in well-exposed areas of scans. At inference time, the network recovers plausible color using only the intensity values. The immediate application of our approach is the selective colorization of LiDAR data in those scans or regions with missing or poor color data.
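The intensity-guided exposure correction abstract above can be illustrated with a small sketch. The paper does not publish its exact formula, so everything here is an assumption: the per-pixel exposure factor pushes each pixel's luminance toward a target proportional to its LiDAR intensity (since reflectivity is nearly illumination-independent indoors), followed by a generic Reinhard curve and gamma for display. Function names, the `target` and `strength` parameters, and the tone curve are all illustrative choices, not the authors' method.

```python
import numpy as np

def intensity_guided_tonemap(hdr_rgb, intensity, target=0.45,
                             strength=1.0, eps=1e-6):
    """Hypothetical sketch of intensity-guided tone mapping.

    hdr_rgb   : (H, W, 3) linear HDR radiance values.
    intensity : (H, W) LiDAR infrared reflectivity in [0, 1].
    """
    # Per-pixel luminance (Rec. 709 weights).
    luminance = hdr_rgb @ np.array([0.2126, 0.7152, 0.0722])
    # Desired luminance is proportional to surface reflectivity, which is
    # nearly independent of scene illumination for indoor scans.
    desired = target * np.clip(intensity, eps, 1.0)
    # Exposure factor that moves each pixel toward its desired luminance.
    factor = (desired / np.maximum(luminance, eps)) ** strength
    exposed = hdr_rgb * factor[..., None]
    # Generic Reinhard tone curve plus display gamma.
    ldr = exposed / (1.0 + exposed)
    return np.clip(ldr, 0.0, 1.0) ** (1.0 / 2.2)
```

Under this sketch, a dimly lit but highly reflective wall is brightened while a blown-out but dark-colored surface is dimmed, which matches the behavior the abstract describes for mixed-illumination 360° interiors.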
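The neural colorization abstract hinges on training only where the captured color is trustworthy. As a minimal sketch of that idea, assuming simple per-channel exposure thresholds (the paper does not specify its selection criterion), one can build a "well-exposed" mask and restrict the translation network's reconstruction loss to it. The threshold values and the L1 loss are illustrative assumptions, not the authors' exact training setup.

```python
import numpy as np

def well_exposed_mask(rgb, low=0.05, high=0.95):
    """Mark pixels whose every channel lies strictly inside [low, high].
    Thresholds are illustrative, not taken from the paper."""
    return np.all((rgb > low) & (rgb < high), axis=-1)

def masked_l1_loss(pred_rgb, target_rgb, mask):
    """L1 reconstruction loss restricted to well-exposed pixels, so an
    intensity-to-color translation network never learns from over- or
    under-exposed color data."""
    if not mask.any():
        return 0.0
    diff = np.abs(pred_rgb - target_rgb)
    return float(diff[mask].mean())
```

At inference time the trained network would then predict color from intensity alone, and this same mask could select which scan regions to overwrite with the predicted colors (the "selective colorization" use case from the abstract).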