Title: Cinematic Gaussians: Real-Time HDR Radiance Fields with Depth of Field
Authors: Wang, Chao; Wolski, Krzysztof; Kerbl, Bernhard; Serrano, Ana; Bemana, Mojtaba; Seidel, Hans-Peter; Myszkowski, Karol; Leimkühler, Thomas
Editors: Chen, Renjie; Ritschel, Tobias; Whiting, Emily
Date: 2024-10-13
Year: 2024
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.15214
Handle: https://diglib.eg.org/handle/10.1111/cgf15214
Pages: 13
License: Attribution 4.0 International License (CC BY 4.0)
CCS Concepts: Computing methodologies → Computational photography; Computing methodologies → Image-based rendering

Abstract: Radiance field methods represent the state of the art in reconstructing complex scenes from multi-view photos. However, these reconstructions often suffer from one or both of the following limitations. First, they typically represent scenes in low dynamic range (LDR), which restricts their use to evenly lit environments and hinders immersive viewing experiences. Second, their reliance on a pinhole camera model, which assumes all scene elements are in focus in the input images, presents practical challenges and complicates refocusing during novel-view synthesis. Addressing these limitations, we present a lightweight method based on 3D Gaussian Splatting that takes multi-view LDR images of a scene with varying exposure times, apertures, and focus distances as input to reconstruct a high-dynamic-range (HDR) radiance field. By incorporating analytical convolutions of Gaussians based on a thin-lens camera model, as well as a tonemapping module, our reconstructions enable the rendering of HDR content with flexible refocusing capabilities. We demonstrate that our combined treatment of HDR and depth of field facilitates real-time cinematic rendering, outperforming the state of the art.
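The abstract's key idea of composing defocus blur analytically with splatted Gaussians can be sketched as follows. Under a thin-lens model, a point at a given depth projects to a circle of confusion (CoC); approximating that CoC as an isotropic Gaussian, convolution with a splat's projected 2D Gaussian reduces to adding covariances. This is a minimal illustrative sketch, not the paper's implementation: all function names are hypothetical, and the choice of treating the CoC diameter as roughly four standard deviations of the blur kernel is an assumption.

```python
import numpy as np

def coc_diameter(depth, focus_dist, focal_len, aperture_diam):
    """Thin-lens circle-of-confusion diameter (scene units on the image
    plane) for a point at `depth` when the lens is focused at `focus_dist`."""
    m = focal_len / (focus_dist - focal_len)      # image-side magnification factor
    return aperture_diam * m * abs(depth - focus_dist) / depth

def defocused_cov2d(cov2d, depth, focus_dist, focal_len, aperture_diam, px_per_unit):
    """Compose a splat's projected 2D covariance with defocus blur.

    Convolving two Gaussians adds their covariance matrices, so the blur
    is applied by adding an isotropic CoC term (names are illustrative).
    """
    c_px = coc_diameter(depth, focus_dist, focal_len, aperture_diam) * px_per_unit
    sigma = c_px / 4.0   # assumption: CoC diameter ~ 4 sigma of the blur Gaussian
    return cov2d + (sigma ** 2) * np.eye(2)

# Example: a splat on the focal plane stays sharp; one behind it gets blurred.
sharp = defocused_cov2d(np.eye(2), depth=2.0, focus_dist=2.0,
                        focal_len=0.05, aperture_diam=0.01, px_per_unit=1000.0)
blurred = defocused_cov2d(np.eye(2), depth=4.0, focus_dist=2.0,
                          focal_len=0.05, aperture_diam=0.01, px_per_unit=1000.0)
```

Because the composition stays a Gaussian, depth of field is obtained in closed form per splat rather than by multi-sample aperture integration, which is what makes real-time refocusing feasible.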