Tour the World From Your Couch: Google ‘NeRF-W’ Delivers Accurate 3D Scene Reconstruction of Complex Outdoor Environments

Synced | Published in SyncedReview | Aug 10, 2020

Google researchers have introduced a series of extensions to the SOTA view-synthesis method Neural Radiance Fields (NeRF) that enable it to produce high-quality 3D representations of complex scenes with only unstructured image collections as input. The extensions improve NeRF’s ability to model real-world phenomena common in such uncontrolled images, such as variable illumination and transient occluders.

Proposed in March by researchers from UC Berkeley, Google, and UC San Diego, NeRF implicitly models the radiance field and density of a scene within the weights of a neural network and then uses direct volume rendering to synthesize new views. While it has achieved an unprecedented level of fidelity across a range of challenging scenes, NeRF is only effective in highly controlled settings: the scene has to be captured within a short time frame during which lighting effects remain constant and all scene content is static.
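
For context on how this works under the hood: NeRF trains a multilayer perceptron to map a 3D position and viewing direction to a volume density and an emitted color, then renders each pixel by compositing many samples along the camera ray. The snippet below is a minimal NumPy sketch of that compositing (quadrature) step only; the random densities and colors are placeholders standing in for queries to a trained network, not part of the method’s actual code.

```python
import numpy as np

def composite_ray(sigmas, colors, deltas):
    """NeRF's discrete volume rendering quadrature:
    C = sum_i T_i * (1 - exp(-sigma_i * delta_i)) * c_i,
    where T_i is the transmittance accumulated before sample i."""
    alphas = 1.0 - np.exp(-sigmas * deltas)                         # per-sample opacity
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))  # T_i
    weights = trans * alphas                                        # each sample's contribution
    return (weights[:, None] * colors).sum(axis=0)                  # composited RGB

# Placeholder samples along one camera ray; a real NeRF obtains these
# by querying the trained MLP at points sampled along the ray.
n = 64
sigmas = np.random.rand(n) * 2.0   # volume densities
colors = np.random.rand(n, 3)      # RGB at each sample
deltas = np.full(n, 1.0 / n)       # spacing between adjacent samples

print("rendered pixel:", composite_ray(sigmas, colors, deltas))
```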

The Google researchers propose NeRF in the Wild (NeRF-W), a novel approach for 3D scene reconstruction of complex outdoor environments from in-the-wild photo collections. NeRF-W is built on NeRF with two enhancements explicitly designed to handle challenges particular to unconstrained imagery.

The researchers explain that NeRF-W learns a per-image latent embedding that captures the photometric appearance variations often present in in-the-wild data. Scenes are decomposed into shared and image-dependent components, enabling the approach to isolate transient elements from the static scene.
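
In concrete terms, the two extensions amount to (1) conditioning the color prediction on a learned per-image appearance vector and (2) adding an image-dependent transient head that predicts its own density, color, and an uncertainty value used to discount occluded pixels during training. The PyTorch-style skeleton below illustrates that structure only; the layer sizes, names, and the omission of positional encoding are our own simplifications, not the paper’s exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeRFWSketch(nn.Module):
    """Illustrative skeleton of NeRF-W's two extensions (layer sizes and
    names are our own; positional encoding and other details are omitted).
    A shared trunk models geometry; a per-image appearance embedding
    conditions the static color head; a per-image transient head predicts
    its own density, color, and an uncertainty value."""

    def __init__(self, num_images, d_appear=48, d_transient=16, d_hidden=256):
        super().__init__()
        self.appearance_emb = nn.Embedding(num_images, d_appear)  # one latent per photo
        self.transient_emb = nn.Embedding(num_images, d_transient)
        self.trunk = nn.Sequential(
            nn.Linear(3, d_hidden), nn.ReLU(),
            nn.Linear(d_hidden, d_hidden), nn.ReLU())
        self.static_sigma = nn.Linear(d_hidden, 1)                 # shared, image-independent geometry
        self.static_rgb = nn.Sequential(
            nn.Linear(d_hidden + 3 + d_appear, d_hidden // 2), nn.ReLU(),
            nn.Linear(d_hidden // 2, 3), nn.Sigmoid())             # appearance-conditioned color
        self.transient_head = nn.Sequential(
            nn.Linear(d_hidden + d_transient, d_hidden // 2), nn.ReLU(),
            nn.Linear(d_hidden // 2, 5))                           # density, RGB, uncertainty

    def forward(self, xyz, view_dir, image_id):
        h = self.trunk(xyz)
        sigma = F.relu(self.static_sigma(h))                       # static density
        a = self.appearance_emb(image_id)
        rgb = self.static_rgb(torch.cat([h, view_dir, a], dim=-1)) # static color
        t = self.transient_head(
            torch.cat([h, self.transient_emb(image_id)], dim=-1))
        t_sigma = F.relu(t[..., :1])                               # transient density
        t_rgb = torch.sigmoid(t[..., 1:4])                         # transient color
        beta = F.softplus(t[..., 4:])                              # uncertainty
        return sigma, rgb, t_sigma, t_rgb, beta

# Toy usage: four sample points drawn from four different training photos.
model = NeRFWSketch(num_images=1000)
outputs = model(torch.randn(4, 3), torch.randn(4, 3), torch.tensor([0, 1, 2, 3]))
```

Because geometry and static color are shared across images while the transient head is image-specific, novel views can be rendered from the static components alone, which is what lets NeRF-W drop occluders and smoothly interpolate appearance between photos.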

The researchers applied NeRF-W to Internet photo collections of famous global landmarks and found it capable of producing photorealistic, spatially consistent scene representations despite unknown or confounding factors. They demonstrated detailed, high-fidelity renderings from novel viewpoints as well as smooth appearance interpolation and 3D consistency in rendered videos.

The team evaluated the proposed novel view synthesis method against Google and the University of Maryland’s 2019 Neural Rerendering in the Wild (NRW), the original NeRF, and two ablations of NeRF-W: NeRF-A (appearance only) and NeRF-U (uncertainty only); the full NeRF-W combines both extensions. Experimental evaluation on both synthetic and real-world data demonstrated significant qualitative and quantitative improvements over the other approaches across all considered metrics.

While NeRF-W can produce photorealistic and temporally consistent renderings from unstructured photographs, the researchers note that rendering quality degrades in scene areas rarely covered by the captured images. Like NeRF, NeRF-W is also sensitive to camera calibration errors, which can lead to overly blurry reconstructions in parts of a scene.

While their work “accomplishes significant strides towards generating novel views in outdoor unconstrained environments,” the researchers say the problem of outdoor scene reconstruction from image data remains far from fully solved.

The paper NeRF in the Wild: Neural Radiance Fields for Unconstrained Photo Collections is on arXiv. Additional information and examples are available on the GitHub project page.

Reporter: Yuan Yuan | Editor: Michael Sarazen

Synced Report | A Survey of China’s Artificial Intelligence Solutions in Response to the COVID-19 Pandemic — 87 Case Studies from 700+ AI Vendors

This report offers a look at how China has leveraged artificial intelligence technologies in the battle against COVID-19. It is also available on Amazon Kindle. Along with the report, we have introduced a database covering an additional 1,428 artificial intelligence solutions across 12 pandemic scenarios.

Click here to find more reports from us.


AI Technology & Industry Review — syncedreview.com | Newsletter: http://bit.ly/2IYL6Y2 | Share My Research http://bit.ly/2TrUPMI | Twitter: @Synced_Global