Open Access Te Herenga Waka-Victoria University of Wellington

Reconstructing Reflection Maps Using a Stacked-CNN for Mixed Reality Rendering

journal contribution
posted on 2021-11-30, 03:32 authored by Andrew ChalmersAndrew Chalmers, Junhong ZhaoJunhong Zhao, Daniel Medeiros, Taehyun Rhee
Corresponding lighting and reflectance between real and virtual objects are important for spatial presence in augmented and mixed reality (AR and MR) applications. We present a method to reconstruct real-world environmental lighting, encoded as a reflection map (RM), from a conventional photograph. To achieve this, we propose a stacked convolutional neural network (SCNN) that predicts high dynamic range (HDR) 360° RMs with varying roughness from a limited field of view, low dynamic range photograph. The SCNN is progressively trained from high to low roughness to predict RMs at varying roughness levels, where each roughness level corresponds to a virtual object's roughness (from diffuse to glossy) for rendering. The predicted RM provides high-fidelity rendering of virtual objects to match the background photograph. We illustrate the use of our method with indoor and outdoor scenes trained on separate indoor/outdoor SCNNs, showing plausible rendering and composition of virtual objects in AR/MR. We show that our method improves quality over previous methods via a comparative user study and error metrics.
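The progressive high-to-low roughness schedule described in the abstract requires target RMs pre-filtered at each roughness level. The following is a minimal NumPy sketch of that idea, not the paper's implementation: the box-blur stand-in for a proper BRDF pre-filter, the roughness-to-kernel mapping, and the three-level schedule are all illustrative assumptions.

```python
import numpy as np

def prefilter_rm(rm, roughness):
    """Approximate a rough reflection map by mean-filtering a base RM.

    The blur width grows with roughness (an illustrative stand-in for a
    physically based GGX/Phong pre-filter of an environment map)."""
    h, w = rm.shape
    k = max(1, int(roughness * w) | 1)  # odd kernel width, scaled by roughness
    pad = k // 2
    # Wrap padding: horizontal wrap matches a 360° panorama; wrapping
    # vertically as well is a simplification for this sketch.
    padded = np.pad(rm, pad, mode='wrap')
    out = np.empty_like(rm)
    for y in range(h):
        for x in range(w):
            out[y, x] = padded[y:y + k, x:x + k].mean()
    return out

# Progressive schedule: high roughness (diffuse) first, low (glossy) last.
np.random.seed(0)
roughness_levels = [0.8, 0.4, 0.1]
base = np.random.rand(16, 32)  # toy single-channel 360° reflection map
targets = [prefilter_rm(base, r) for r in roughness_levels]
```

Higher-roughness targets are smoother (lower variance), which is what lets a network learn the easier diffuse levels before being refined toward the glossy ones.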

History

Preferred citation

Chalmers, A., Zhao, J., Medeiros, D. & Rhee, T. (2021). Reconstructing Reflection Maps Using a Stacked-CNN for Mixed Reality Rendering. IEEE Transactions on Visualization and Computer Graphics, 27(10), 4073-4084. https://doi.org/10.1109/TVCG.2020.3001917

Journal title

IEEE Transactions on Visualization and Computer Graphics

Volume

27

Issue

10

Publication date

2021-10-01

Pagination

4073-4084

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Publication status

Published

ISSN

1077-2626

eISSN

1941-0506

Language

en