File Download
There are no files associated with this item.
Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1109/ICCV51070.2023.00301
- Scopus: eid_2-s2.0-85169019940
Citations:
- Scopus: 0
Appears in Collections:
Conference Paper: AssetField: Assets Mining and Reconfiguration in Ground Feature Plane Representation
Title | AssetField: Assets Mining and Reconfiguration in Ground Feature Plane Representation |
---|---|
Authors | Xiangli, Yuanbo; Xu, Linning; Pan, Xingang; Zhao, Nanxuan; Dai, Bo; Lin, Dahua |
Issue Date | 2023 |
Citation | Proceedings of the IEEE International Conference on Computer Vision, 2023, p. 3228-3238 |
Abstract | Both indoor and outdoor environments are inherently structured and repetitive. Traditional modeling pipelines keep an asset library storing unique object templates, which is both versatile and memory efficient in practice. Inspired by this observation, we propose AssetField, a novel neural scene representation that learns a set of object-aware ground feature planes to represent the scene, where an asset library storing template feature patches can be constructed in an unsupervised manner. Unlike existing methods which require object masks to query spatial points for object editing, our ground feature plane representation offers a natural visualization of the scene in the bird-eye view, allowing a variety of operations (e.g. translation, duplication, deformation) on objects to configure a new scene. With the template feature patches, group editing is enabled for scenes with many recurring items to avoid repetitive work on object individuals. We show that AssetField not only achieves competitive performance for novel-view synthesis but also generates realistic renderings for new scene configurations. |
Persistent Identifier | http://hdl.handle.net/10722/352380 |
ISSN | 1550-5499; 2023 SCImago Journal Rankings: 12.263 |
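The abstract describes editing a scene by manipulating template feature patches on a 2D ground feature plane viewed from above. As a minimal illustrative sketch only (the array names, patch coordinates, and helper functions below are assumptions, not the paper's actual data structures or code), the kind of translation/duplication editing referred to can be pictured as copy-and-paste operations on a 2D feature grid:

```python
# Illustrative sketch of bird's-eye-view patch editing on a ground feature
# plane. This is NOT the AssetField implementation; names and shapes are
# assumptions for demonstration purposes.
import numpy as np

# A toy ground feature plane: H x W spatial grid with C feature channels.
H, W, C = 64, 64, 16
ground_plane = np.random.randn(H, W, C).astype(np.float32)

def extract_patch(plane, y, x, h, w):
    """Cut a template feature patch (an 'asset') out of the plane."""
    return plane[y:y + h, x:x + w].copy()

def paste_patch(plane, patch, y, x):
    """Write a template patch back at a new location (translation / duplication)."""
    h, w = patch.shape[:2]
    plane[y:y + h, x:x + w] = patch
    return plane

# Mine one template patch and reuse it at two new positions,
# mimicking group editing of recurring objects in a scene.
template = extract_patch(ground_plane, y=8, x=8, h=12, w=12)
ground_plane = paste_patch(ground_plane, template, y=30, x=10)
ground_plane = paste_patch(ground_plane, template, y=30, x=40)
```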
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Xiangli, Yuanbo | - |
dc.contributor.author | Xu, Linning | - |
dc.contributor.author | Pan, Xingang | - |
dc.contributor.author | Zhao, Nanxuan | - |
dc.contributor.author | Dai, Bo | - |
dc.contributor.author | Lin, Dahua | - |
dc.date.accessioned | 2024-12-16T03:58:34Z | - |
dc.date.available | 2024-12-16T03:58:34Z | - |
dc.date.issued | 2023 | - |
dc.identifier.citation | Proceedings of the IEEE International Conference on Computer Vision, 2023, p. 3228-3238 | - |
dc.identifier.issn | 1550-5499 | - |
dc.identifier.uri | http://hdl.handle.net/10722/352380 | - |
dc.description.abstract | Both indoor and outdoor environments are inherently structured and repetitive. Traditional modeling pipelines keep an asset library storing unique object templates, which is both versatile and memory efficient in practice. Inspired by this observation, we propose AssetField, a novel neural scene representation that learns a set of object-aware ground feature planes to represent the scene, where an asset library storing template feature patches can be constructed in an unsupervised manner. Unlike existing methods which require object masks to query spatial points for object editing, our ground feature plane representation offers a natural visualization of the scene in the bird-eye view, allowing a variety of operations (e.g. translation, duplication, deformation) on objects to configure a new scene. With the template feature patches, group editing is enabled for scenes with many recurring items to avoid repetitive work on object individuals. We show that AssetField not only achieves competitive performance for novel-view synthesis but also generates realistic renderings for new scene configurations. | - |
dc.language | eng | - |
dc.relation.ispartof | Proceedings of the IEEE International Conference on Computer Vision | - |
dc.title | AssetField: Assets Mining and Reconfiguration in Ground Feature Plane Representation | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/ICCV51070.2023.00301 | - |
dc.identifier.scopus | eid_2-s2.0-85169019940 | - |
dc.identifier.spage | 3228 | - |
dc.identifier.epage | 3238 | - |