File Download
There are no files associated with this item.
Links for fulltext (may require subscription):
- Publisher Website: 10.1007/978-3-031-73113-6_27
- Scopus: eid_2-s2.0-85210863353
Citations:
- Scopus: 0

Appears in Collections: Conference Paper

RoomTex: Texturing Compositional Indoor Scenes via Iterative Inpainting
Title | RoomTex: Texturing Compositional Indoor Scenes via Iterative Inpainting
---|---
Authors | Wang, Qi; Lu, Ruijie; Xu, Xudong; Wang, Jingbo; Wang, Michael Yu; Dai, Bo; Zeng, Gang; Xu, Dan
Keywords | Scene Generation; Scene Texturing; Texture Synthesis
Issue Date | 2025
Citation | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2025, v. 15126 LNCS, p. 465-482
Abstract | The advancement of diffusion models has pushed the boundary of text-to-3D object generation. While it is straightforward to composite objects into a scene with reasonable geometry, it is nontrivial to texture such a scene perfectly due to style inconsistency and occlusions between objects. To tackle these problems, we propose a coarse-to-fine 3D scene texturing framework, referred to as RoomTex, to generate high-fidelity and style-consistent textures for untextured compositional scene meshes. In the coarse stage, RoomTex first unwraps the scene mesh to a panoramic depth map and leverages ControlNet to generate a room panorama, which is regarded as the coarse reference to ensure global texture consistency. In the fine stage, based on the panoramic image and perspective depth maps, RoomTex iteratively refines and textures each object in the room along a series of selected camera views until the object is completely painted. Moreover, we propose to maintain superior alignment between RGB and depth spaces via subtle edge detection methods. Extensive experiments show our method is capable of generating high-quality and diverse room textures, and more importantly, supporting interactive fine-grained texture control and flexible scene editing thanks to our inpainting-based framework and compositional mesh input. Our project page is available at https://qwang666.github.io/RoomTex/.
Persistent Identifier | http://hdl.handle.net/10722/352491
ISSN | 0302-9743
2023 SCImago Journal Rankings | 0.606
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Wang, Qi | - |
dc.contributor.author | Lu, Ruijie | - |
dc.contributor.author | Xu, Xudong | - |
dc.contributor.author | Wang, Jingbo | - |
dc.contributor.author | Wang, Michael Yu | - |
dc.contributor.author | Dai, Bo | - |
dc.contributor.author | Zeng, Gang | - |
dc.contributor.author | Xu, Dan | - |
dc.date.accessioned | 2024-12-16T03:59:26Z | - |
dc.date.available | 2024-12-16T03:59:26Z | - |
dc.date.issued | 2025 | - |
dc.identifier.citation | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2025, v. 15126 LNCS, p. 465-482 | - |
dc.identifier.issn | 0302-9743 | - |
dc.identifier.uri | http://hdl.handle.net/10722/352491 | - |
dc.description.abstract | The advancement of diffusion models has pushed the boundary of text-to-3D object generation. While it is straightforward to composite objects into a scene with reasonable geometry, it is nontrivial to texture such a scene perfectly due to style inconsistency and occlusions between objects. To tackle these problems, we propose a coarse-to-fine 3D scene texturing framework, referred to as RoomTex, to generate high-fidelity and style-consistent textures for untextured compositional scene meshes. In the coarse stage, RoomTex first unwraps the scene mesh to a panoramic depth map and leverages ControlNet to generate a room panorama, which is regarded as the coarse reference to ensure global texture consistency. In the fine stage, based on the panoramic image and perspective depth maps, RoomTex iteratively refines and textures each object in the room along a series of selected camera views until the object is completely painted. Moreover, we propose to maintain superior alignment between RGB and depth spaces via subtle edge detection methods. Extensive experiments show our method is capable of generating high-quality and diverse room textures, and more importantly, supporting interactive fine-grained texture control and flexible scene editing thanks to our inpainting-based framework and compositional mesh input. Our project page is available at https://qwang666.github.io/RoomTex/. | -
dc.language | eng | - |
dc.relation.ispartof | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | - |
dc.subject | Scene Generation | - |
dc.subject | Scene Texturing | - |
dc.subject | Texture Synthesis | - |
dc.title | RoomTex: Texturing Compositional Indoor Scenes via Iterative Inpainting | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1007/978-3-031-73113-6_27 | - |
dc.identifier.scopus | eid_2-s2.0-85210863353 | - |
dc.identifier.volume | 15126 LNCS | - |
dc.identifier.spage | 465 | - |
dc.identifier.epage | 482 | - |
dc.identifier.eissn | 1611-3349 | - |
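The fine stage described in the abstract paints each object incrementally: each selected camera view reveals some texels, an inpainting step fills the newly visible, still-unpainted ones, and the loop repeats until the object is fully covered. The coverage logic of that view-by-view loop can be sketched as below; note this is an illustrative reconstruction, not the authors' code, and `inpaint_fn` is a hypothetical stand-in for the depth-conditioned diffusion inpainting step:

```python
import numpy as np

def iterative_texture(obj_mask, views, inpaint_fn):
    """Paint an object's texture map view by view until fully covered.

    obj_mask: boolean array marking texels that belong to the object.
    views: list of boolean arrays, texels visible from each camera view.
    inpaint_fn: fills newly visible, still-unpainted texels (a stand-in
                for the depth-conditioned diffusion inpainting step).
    """
    painted = np.zeros_like(obj_mask, dtype=bool)
    texture = np.zeros(obj_mask.shape, dtype=float)
    for visible in views:
        # Texels this view can paint that no earlier view has painted.
        todo = obj_mask & visible & ~painted
        if not todo.any():
            continue
        texture[todo] = inpaint_fn(texture, todo)
        painted |= todo
        if painted[obj_mask].all():  # object completely painted
            break
    return texture, painted

# Toy run: a 4x4 object covered by two overlapping views.
obj = np.ones((4, 4), dtype=bool)
v1 = np.zeros((4, 4), dtype=bool); v1[:, :3] = True
v2 = np.zeros((4, 4), dtype=bool); v2[:, 2:] = True
tex, done = iterative_texture(obj, [v1, v2], lambda t, m: 1.0)
print(done.all())  # True: the two views together cover the object
```

In the paper's setting the views are chosen so their union covers each object, and overlap between views is what lets each inpainting step stay consistent with previously painted regions.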