Conference Paper: F2-NeRF: Fast Neural Radiance Field Training with Free Camera Trajectories
Field | Value
---|---
Title | F2-NeRF: Fast Neural Radiance Field Training with Free Camera Trajectories
Authors | Wang, Peng; Liu, Yuan; Chen, Zhaoxi; Liu, Lingjie; Liu, Ziwei; Komura, Taku; Theobalt, Christian; Wang, Wenping
Issue Date | 22-Jun-2023
Abstract | This paper presents a novel grid-based NeRF called F²-NeRF (Fast-Free-NeRF) for novel view synthesis, which enables arbitrary input camera trajectories and costs only a few minutes to train. Existing fast grid-based NeRF training frameworks, such as Instant-NGP, Plenoxels, DVGO, and TensoRF, are mainly designed for bounded scenes and rely on space warping to handle unbounded scenes. The two existing widely-used space-warping methods are designed only for the forward-facing trajectory or the 360° object-centric trajectory and cannot process arbitrary trajectories. In this paper, we delve deep into the mechanism of space warping for handling unbounded scenes. Based on our analysis, we further propose a novel space-warping method called perspective warping, which allows us to handle arbitrary trajectories in the grid-based NeRF framework. Extensive experiments demonstrate that F²-NeRF is able to use the same perspective warping to render high-quality images on two standard datasets and a new free trajectory dataset collected by us. Project page: totoro97.github.io/projects/f2-nerf.
Persistent Identifier | http://hdl.handle.net/10722/337675
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Wang, Peng | - |
dc.contributor.author | Liu, Yuan | - |
dc.contributor.author | Chen, Zhaoxi | - |
dc.contributor.author | Liu, Lingjie | - |
dc.contributor.author | Liu, Ziwei | - |
dc.contributor.author | Komura, Taku | - |
dc.contributor.author | Theobalt, Christian | - |
dc.contributor.author | Wang, Wenping | - |
dc.date.accessioned | 2024-03-11T10:23:00Z | - |
dc.date.available | 2024-03-11T10:23:00Z | - |
dc.date.issued | 2023-06-22 | - |
dc.identifier.uri | http://hdl.handle.net/10722/337675 | - |
dc.description.abstract | This paper presents a novel grid-based NeRF called F²-NeRF (Fast-Free-NeRF) for novel view synthesis, which enables arbitrary input camera trajectories and costs only a few minutes to train. Existing fast grid-based NeRF training frameworks, such as Instant-NGP, Plenoxels, DVGO, and TensoRF, are mainly designed for bounded scenes and rely on space warping to handle unbounded scenes. The two existing widely-used space-warping methods are designed only for the forward-facing trajectory or the 360° object-centric trajectory and cannot process arbitrary trajectories. In this paper, we delve deep into the mechanism of space warping for handling unbounded scenes. Based on our analysis, we further propose a novel space-warping method called perspective warping, which allows us to handle arbitrary trajectories in the grid-based NeRF framework. Extensive experiments demonstrate that F²-NeRF is able to use the same perspective warping to render high-quality images on two standard datasets and a new free trajectory dataset collected by us. Project page: totoro97.github.io/projects/f2-nerf. | - |
dc.language | eng | - |
dc.relation.ispartof | The IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (18/06/2023-22/06/2023, Vancouver) | - |
dc.title | F2-NeRF: Fast Neural Radiance Field Training with Free Camera Trajectories | - |
dc.type | Conference_Paper | - |
dc.description.nature | preprint | - |
dc.identifier.doi | 10.1109/CVPR52729.2023.00404 | - |