Conference Paper: RoomTex: Texturing Compositional Indoor Scenes via Iterative Inpainting

Title: RoomTex: Texturing Compositional Indoor Scenes via Iterative Inpainting
Authors: Wang, Qi; Lu, Ruijie; Xu, Xudong; Wang, Jingbo; Wang, Michael Yu; Dai, Bo; Zeng, Gang; Xu, Dan
Keywords: Scene Generation; Scene Texturing; Texture Synthesis
Issue Date: 2025
Citation: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2025, v. 15126 LNCS, p. 465-482
Abstract: The advancement of diffusion models has pushed the boundary of text-to-3D object generation. While it is straightforward to composite objects into a scene with reasonable geometry, it is nontrivial to texture such a scene perfectly due to style inconsistency and occlusions between objects. To tackle these problems, we propose a coarse-to-fine 3D scene texturing framework, referred to as RoomTex, to generate high-fidelity and style-consistent textures for untextured compositional scene meshes. In the coarse stage, RoomTex first unwraps the scene mesh to a panoramic depth map and leverages ControlNet to generate a room panorama, which is regarded as the coarse reference to ensure the global texture consistency. In the fine stage, based on the panoramic image and perspective depth maps, RoomTex will refine and texture every single object in the room iteratively along a series of selected camera views, until this object is completely painted. Moreover, we propose to maintain superior alignment between RGB and depth spaces via subtle edge detection methods. Extensive experiments show our method is capable of generating high-quality and diverse room textures, and more importantly, supporting interactive fine-grained texture control and flexible scene editing thanks to our inpainting-based framework and compositional mesh input. Our project page is available at https://qwang666.github.io/RoomTex/.
Persistent Identifier: http://hdl.handle.net/10722/352491
ISSN: 0302-9743
2023 SCImago Journal Rankings: 0.606
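
The coarse stage described in the abstract (a ControlNet conditioned on a panoramic depth map producing a reference room panorama) can be sketched with the public diffusers library. This is only an illustrative sketch: the checkpoint names, prompt, and input file below are assumptions for demonstration, not the authors' released implementation.

```python
# Illustrative sketch of the coarse stage: generate an RGB room panorama
# conditioned on a panoramic depth map with a depth ControlNet.
# Checkpoints, prompt, and the input file are assumed, not from RoomTex.
import torch
from PIL import Image
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

# Publicly available depth ControlNet and SD 1.5 base model (assumed choices).
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-depth", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
).to("cuda")

# Panoramic depth map rendered from the untextured scene mesh (hypothetical file).
depth_pano = Image.open("scene_depth_panorama.png").convert("RGB")

# Depth-conditioned generation of the coarse reference panorama.
panorama = pipe(
    "a cozy modern living room, photorealistic, consistent style",
    image=depth_pano,
    num_inference_steps=30,
).images[0]
panorama.save("room_panorama.png")
```

In the paper's pipeline this panorama serves as the global style reference; the fine stage then inpaints each object from selected perspective views, which is not reproduced here.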

 

DC Field: Value
dc.contributor.author: Wang, Qi
dc.contributor.author: Lu, Ruijie
dc.contributor.author: Xu, Xudong
dc.contributor.author: Wang, Jingbo
dc.contributor.author: Wang, Michael Yu
dc.contributor.author: Dai, Bo
dc.contributor.author: Zeng, Gang
dc.contributor.author: Xu, Dan
dc.date.accessioned: 2024-12-16T03:59:26Z
dc.date.available: 2024-12-16T03:59:26Z
dc.date.issued: 2025
dc.identifier.citation: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2025, v. 15126 LNCS, p. 465-482
dc.identifier.issn: 0302-9743
dc.identifier.uri: http://hdl.handle.net/10722/352491
dc.description.abstract: The advancement of diffusion models has pushed the boundary of text-to-3D object generation. While it is straightforward to composite objects into a scene with reasonable geometry, it is nontrivial to texture such a scene perfectly due to style inconsistency and occlusions between objects. To tackle these problems, we propose a coarse-to-fine 3D scene texturing framework, referred to as RoomTex, to generate high-fidelity and style-consistent textures for untextured compositional scene meshes. In the coarse stage, RoomTex first unwraps the scene mesh to a panoramic depth map and leverages ControlNet to generate a room panorama, which is regarded as the coarse reference to ensure the global texture consistency. In the fine stage, based on the panoramic image and perspective depth maps, RoomTex will refine and texture every single object in the room iteratively along a series of selected camera views, until this object is completely painted. Moreover, we propose to maintain superior alignment between RGB and depth spaces via subtle edge detection methods. Extensive experiments show our method is capable of generating high-quality and diverse room textures, and more importantly, supporting interactive fine-grained texture control and flexible scene editing thanks to our inpainting-based framework and compositional mesh input. Our project page is available at https://qwang666.github.io/RoomTex/.
dc.language: eng
dc.relation.ispartof: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
dc.subject: Scene Generation
dc.subject: Scene Texturing
dc.subject: Texture Synthesis
dc.title: RoomTex: Texturing Compositional Indoor Scenes via Iterative Inpainting
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1007/978-3-031-73113-6_27
dc.identifier.scopus: eid_2-s2.0-85210863353
dc.identifier.volume: 15126 LNCS
dc.identifier.spage: 465
dc.identifier.epage: 482
dc.identifier.eissn: 1611-3349
