CEIG2021
Browsing CEIG2021 by Author "Andujar, Carlos"
Now showing 1 - 2 of 2
Item: Digital Layered Models of Architecture and Mural Paintings over Time (The Eurographics Association, 2021)
Guardia, Milagros; Pogliani, Paola; Bordi, Giulia; Charalambous, Panayiotis; Andujar, Carlos; Munoz-Pandiella, Imanol; Pueyo, Xavier; Ortega, Lidia M. and Chica, Antonio

The European project Enhancement of Heritage Experiences: The Middle Ages. Digital Layered Models of Architecture and Mural Paintings over Time (EHEM) aims to obtain virtual reconstructions of medieval artistic heritage (architecture with mural paintings) that are as close as possible to the original at different points in time, incorporating historical-artistic knowledge and the diachronic perspective of heritage. The project also aims to capture not only how these painted buildings are and how they were, but also what function they had, how they were used, and how they were perceived by their different users. EHEM will offer an instrument for researchers, restorers, and heritage curators, and will "humanize" heritage by offering the 21st-century spectator an experience close to that of the users of the Middle Ages.

Item: Neural Colorization of Laser Scans (The Eurographics Association, 2021)
Comino Trinidad, Marc; Andujar, Carlos; Bosch, Carles; Chica, Antonio; Muñoz-Pandiella, Imanol; Ortega, Lidia M. and Chica, Antonio

Laser scanners enable the digitization of 3D surfaces by generating a point cloud in which each point sample includes an intensity (infrared reflectivity) value. Some LiDAR scanners also incorporate cameras to capture the color of the surfaces visible from the scanner location. Getting usable colors everywhere across 360° scans is a challenging task, especially for indoor scenes: LiDAR scanners lack flashes, and placing proper light sources for a 360° indoor scene is either unfeasible or undesirable. As a result, color data from LiDAR scans often lacks adequate quality, either because of poor exposure (too bright or too dark areas) or because of severe illumination changes between scans (e.g. direct sunlight vs. cloudy lighting). In this paper, we present a new method to recover plausible color data from the infrared intensity data available in LiDAR scans. The main idea is to train an adapted image-to-image translation network using color and intensity values from well-exposed areas of the scans. At inference time, the network recovers plausible color using exclusively the intensity values. The immediate application of our approach is the selective colorization of LiDAR data in those scans or regions with missing or poor color data.
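The abstract's key idea is to harvest supervision only from well-exposed pixels: where the captured color is trustworthy, the (intensity, color) pair can train an image-to-image translation network. A minimal NumPy sketch of that pair-selection step is shown below; the luminance thresholds and function names are illustrative assumptions, not values or APIs from the paper, and the network itself is omitted.

```python
import numpy as np

def well_exposed_mask(rgb, lo=0.05, hi=0.95):
    """Mask of pixels whose luminance is neither under- nor over-exposed.
    The thresholds lo/hi are illustrative assumptions, not from the paper."""
    lum = rgb @ np.array([0.299, 0.587, 0.114])  # Rec. 601 luma weights
    return (lum > lo) & (lum < hi)

def training_pairs(intensity, rgb):
    """Collect (intensity, color) supervision pairs from well-exposed pixels,
    mimicking the paper's idea of training only where color data is usable."""
    mask = well_exposed_mask(rgb)
    return intensity[mask], rgb[mask]

# Toy 4x4 "scan": one blown-out pixel and one black pixel get excluded.
rng = np.random.default_rng(0)
intensity = rng.random((4, 4))          # infrared reflectivity channel
rgb = rng.uniform(0.2, 0.8, (4, 4, 3))  # captured color, mostly well exposed
rgb[0, 0] = 1.0  # overexposed pixel, excluded from training
rgb[1, 1] = 0.0  # underexposed pixel, excluded from training
x, y = training_pairs(intensity, rgb)
print(x.shape, y.shape)  # (14,) (14, 3)
```

At inference, a trained network would map the intensity channel alone to plausible color for exactly those pixels the mask rejects.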