
Conference Paper: F2-NeRF: Fast Neural Radiance Field Training with Free Camera Trajectories

Title: F2-NeRF: Fast Neural Radiance Field Training with Free Camera Trajectories
Authors: Wang, Peng; Liu, Yuan; Chen, Zhaoxi; Liu, Lingjie; Liu, Ziwei; Komura, Taku; Theobalt, Christian; Wang, Wenping
Issue Date: 22-Jun-2023
Abstract

This paper presents a novel grid-based NeRF called F²-NeRF (Fast-Free-NeRF) for novel view synthesis, which supports arbitrary input camera trajectories and costs only a few minutes to train. Existing fast grid-based NeRF training frameworks, such as Instant-NGP, Plenoxels, DVGO, and TensoRF, are mainly designed for bounded scenes and rely on space warping to handle unbounded scenes. The two existing widely-used space-warping methods are designed only for forward-facing trajectories or 360° object-centric trajectories and cannot process arbitrary trajectories. In this paper, we delve deep into the mechanism of space warping for handling unbounded scenes. Based on our analysis, we further propose a novel space-warping method, called perspective warping, which allows us to handle arbitrary trajectories in the grid-based NeRF framework. Extensive experiments demonstrate that F²-NeRF is able to use the same perspective warping to render high-quality images on two standard datasets and a new free-trajectory dataset collected by us. Project page: totoro97.github.io/projects/f2-nerf.
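For context, the 360° object-centric warping the abstract refers to is typically a sphere-inversion-style contraction that maps an unbounded scene into a bounded volume. The sketch below illustrates the idea with a mip-NeRF 360-style contraction; it is not the perspective warping proposed in this paper, and the function name `contract_360` is hypothetical.

```python
import numpy as np

def contract_360(x):
    """Sphere-inversion-style contraction (mip-NeRF 360 style, for
    illustration only; not F2-NeRF's perspective warping).

    Points inside the unit ball are left unchanged; points outside are
    mapped into the spherical shell of radius (1, 2), so the whole
    unbounded space fits inside a ball of radius 2.
    """
    x = np.asarray(x, dtype=float)
    n = np.linalg.norm(x, axis=-1, keepdims=True)  # per-point norm
    # Avoid division by zero at the origin; those points satisfy n <= 1
    # and are returned unchanged by np.where anyway.
    safe_n = np.maximum(n, 1e-12)
    return np.where(n <= 1.0, x, (2.0 - 1.0 / safe_n) * (x / safe_n))
```

For example, a point at distance 4 from the origin is pulled in to distance 2 − 1/4 = 1.75, while points inside the unit ball are untouched. Such a warp works well when cameras orbit a central object but, as the abstract notes, breaks down for free trajectories.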


Persistent Identifier: http://hdl.handle.net/10722/337675

 

DC Field | Value | Language
dc.contributor.author | Wang, Peng | -
dc.contributor.author | Liu, Yuan | -
dc.contributor.author | Chen, Zhaoxi | -
dc.contributor.author | Liu, Lingjie | -
dc.contributor.author | Liu, Ziwei | -
dc.contributor.author | Komura, Taku | -
dc.contributor.author | Theobalt, Christian | -
dc.contributor.author | Wang, Wenping | -
dc.date.accessioned | 2024-03-11T10:23:00Z | -
dc.date.available | 2024-03-11T10:23:00Z | -
dc.date.issued | 2023-06-22 | -
dc.identifier.uri | http://hdl.handle.net/10722/337675 | -
dc.description.abstract | This paper presents a novel grid-based NeRF called F²-NeRF (Fast-Free-NeRF) for novel view synthesis, which supports arbitrary input camera trajectories and costs only a few minutes to train. Existing fast grid-based NeRF training frameworks, such as Instant-NGP, Plenoxels, DVGO, and TensoRF, are mainly designed for bounded scenes and rely on space warping to handle unbounded scenes. The two existing widely-used space-warping methods are designed only for forward-facing trajectories or 360° object-centric trajectories and cannot process arbitrary trajectories. In this paper, we delve deep into the mechanism of space warping for handling unbounded scenes. Based on our analysis, we further propose a novel space-warping method, called perspective warping, which allows us to handle arbitrary trajectories in the grid-based NeRF framework. Extensive experiments demonstrate that F²-NeRF is able to use the same perspective warping to render high-quality images on two standard datasets and a new free-trajectory dataset collected by us. Project page: totoro97.github.io/projects/f2-nerf. | -
dc.language | eng | -
dc.relation.ispartof | The IEEE / CVF Computer Vision and Pattern Recognition Conference (CVPR) (18/06/2023-22/06/2023, Vancouver) | -
dc.title | F2-NeRF: Fast Neural Radiance Field Training with Free Camera Trajectories | -
dc.type | Conference_Paper | -
dc.description.nature | preprint | -
dc.identifier.doi | 10.1109/CVPR52729.2023.00404 | -
