Deep Flow Rendering: View Synthesis via Layer-aware Reflection Flow

Date
2022
Journal Title
Computer Graphics Forum
Journal ISSN
1467-8659
Publisher
The Eurographics Association and John Wiley & Sons Ltd.
Abstract
Novel view synthesis (NVS) generates images from unseen viewpoints based on a set of input images. It is challenging because lighting and geometry are difficult to optimize and infer accurately. Although current neural rendering methods have made significant progress, they still struggle to reconstruct global illumination effects such as reflections, and they exhibit ambiguous blurs in highly view-dependent areas. This work addresses high-quality view synthesis with an emphasis on reflections on non-concave surfaces. We propose Deep Flow Rendering, which optimizes direct and indirect lighting separately by leveraging texture mapping, appearance flow, and neural rendering. A learnable texture is used to predict view-independent features while also enabling efficient reflection extraction. To accurately fit view-dependent effects, we adopt a constrained neural flow that transfers image-space features from nearby views to the target view in an edge-preserving manner. A fusing renderer then combines the predictions of both layers to form the output image. The experiments demonstrate that our method outperforms state-of-the-art methods at synthesizing various scenes with challenging reflection effects.
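The abstract's two-layer idea — a view-independent layer predicted from a learnable texture, plus a view-dependent reflection layer warped from nearby views by an appearance flow — can be sketched as a toy NumPy example. This is an illustrative sketch, not the paper's implementation: the bilinear warp, the constant blend weight `alpha`, and the additive fusion are assumptions standing in for the learned components.

```python
import numpy as np

def warp_bilinear(src, flow):
    """Warp image `src` (H, W, C) by a per-pixel 2D `flow` (H, W, 2).

    Samples src at (x + flow_x, y + flow_y) with bilinear
    interpolation and border clamping — a stand-in for the
    constrained neural flow described in the abstract.
    """
    H, W, _ = src.shape
    ys, xs = np.mgrid[0:H, 0:W].astype(np.float32)
    x = np.clip(xs + flow[..., 0], 0, W - 1)
    y = np.clip(ys + flow[..., 1], 0, H - 1)
    x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
    x1, y1 = np.minimum(x0 + 1, W - 1), np.minimum(y0 + 1, H - 1)
    wx, wy = (x - x0)[..., None], (y - y0)[..., None]
    top = src[y0, x0] * (1 - wx) + src[y0, x1] * wx
    bot = src[y1, x0] * (1 - wx) + src[y1, x1] * wx
    return top * (1 - wy) + bot * wy

# Toy fusion: a view-independent (diffuse) layer plus a flow-warped
# reflection layer, blended by a fixed per-pixel alpha. In the paper
# a learned fusing renderer plays this role.
H, W = 4, 4
diffuse = np.full((H, W, 3), 0.5, dtype=np.float32)
reflection = np.random.rand(H, W, 3).astype(np.float32)
flow = np.zeros((H, W, 2), dtype=np.float32)  # identity flow
alpha = np.full((H, W, 1), 0.3, dtype=np.float32)
out = diffuse + alpha * warp_bilinear(reflection, flow)
print(out.shape)  # (4, 4, 3)
```

With the identity (zero) flow, the warp returns the source layer unchanged, so the fusion reduces to a plain blend of the two layers.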
Description

CCS Concepts: Computing methodologies → Image-based rendering; Neural networks

@article{10.1111:cgf.14593,
  journal   = {Computer Graphics Forum},
  title     = {{Deep Flow Rendering: View Synthesis via Layer-aware Reflection Flow}},
  author    = {Dai, Pinxuan and Xie, Ning},
  year      = {2022},
  publisher = {The Eurographics Association and John Wiley & Sons Ltd.},
  ISSN      = {1467-8659},
  DOI       = {10.1111/cgf.14593}
}