
Conference Paper: Texture Generation on 3D Meshes with Point-UV Diffusion

Title: Texture Generation on 3D Meshes with Point-UV Diffusion
Authors: Yu, Xin; Dai, Peng; Li, Wenbo; Ma, Lan; Liu, Zhengzhe; Qi, Xiaojuan
Issue Date: 2-Oct-2023
Abstract

In this work, we focus on synthesizing high-quality textures on 3D meshes. We present Point-UV diffusion, a coarse-to-fine pipeline that marries the denoising diffusion model with UV mapping to generate 3D-consistent and high-quality texture images in UV space. We start by introducing a point diffusion model that synthesizes low-frequency texture components with our tailored style guidance to tackle the biased color distribution. The derived coarse texture offers global consistency and serves as a condition for the subsequent UV diffusion stage, aiding in regularizing the model to generate a 3D-consistent UV texture image. Then, a UV diffusion model with hybrid conditions is developed to enhance texture fidelity in the 2D UV space. Our method can process meshes of any genus, generating diversified, geometry-compatible, and high-fidelity textures. Code is available at https://cvmilab.github.io/Point-UV-Diffusion.
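The abstract describes a two-stage coarse-to-fine pipeline: a point diffusion model first produces low-frequency per-point colors, which then condition a UV diffusion model refining the texture in 2D UV space. The control flow can be sketched as below; all function names and the placeholder "denoising" logic are purely illustrative stand-ins, not the authors' actual models or API.

```python
# Illustrative sketch of the coarse-to-fine Point-UV pipeline from the abstract.
# Both stages here are trivial placeholders standing in for diffusion models.

def coarse_point_diffusion(points, style=None):
    """Stage 1 (hypothetical): synthesize low-frequency per-point colors,
    optionally guided by a style color. Placeholder: a uniform color."""
    base = style if style is not None else (0.5, 0.5, 0.5)
    return [base for _ in points]

def fine_uv_diffusion(uv_image, coarse_colors):
    """Stage 2 (hypothetical): refine the texture in 2D UV space, conditioned
    on the coarse stage-1 colors. Placeholder: blend each texel with the
    average coarse color to mimic conditioning on the coarse texture."""
    n = len(coarse_colors)
    avg = tuple(sum(c[i] for c in coarse_colors) / n for i in range(3))
    return [
        [tuple(0.5 * (texel[i] + avg[i]) for i in range(3)) for texel in row]
        for row in uv_image
    ]

def point_uv_pipeline(points, uv_image, style=None):
    """Run the coarse point stage, then the fine UV stage it conditions."""
    coarse = coarse_point_diffusion(points, style)
    return fine_uv_diffusion(uv_image, coarse)

# Usage: four sampled surface points and a 2x2 UV texture initialized to black.
points = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
uv = [[(0.0, 0.0, 0.0)] * 2 for _ in range(2)]
texture = point_uv_pipeline(points, uv, style=(1.0, 0.0, 0.0))
```

With a red style color and a black initial texture, each output texel blends toward red, illustrating how the coarse stage's global color choice propagates into the refined UV image.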


Persistent Identifier: http://hdl.handle.net/10722/340970


DC Field | Value | Language
dc.contributor.author | Yu, Xin | -
dc.contributor.author | Dai, Peng | -
dc.contributor.author | Li, Wenbo | -
dc.contributor.author | Ma, Lan | -
dc.contributor.author | Liu, Zhengzhe | -
dc.contributor.author | Qi, Xiaojuan | -
dc.date.accessioned | 2024-03-11T10:48:42Z | -
dc.date.available | 2024-03-11T10:48:42Z | -
dc.date.issued | 2023-10-02 | -
dc.identifier.uri | http://hdl.handle.net/10722/340970 | -
dc.description.abstract | <p>In this work, we focus on synthesizing high-quality textures on 3D meshes. We present Point-UV diffusion, a coarse-to-fine pipeline that marries the denoising diffusion model with UV mapping to generate 3D-consistent and high-quality texture images in UV space. We start by introducing a point diffusion model that synthesizes low-frequency texture components with our tailored style guidance to tackle the biased color distribution. The derived coarse texture offers global consistency and serves as a condition for the subsequent UV diffusion stage, aiding in regularizing the model to generate a 3D-consistent UV texture image. Then, a UV diffusion model with hybrid conditions is developed to enhance texture fidelity in the 2D UV space. Our method can process meshes of any genus, generating diversified, geometry-compatible, and high-fidelity textures. Code is available at https://cvmilab.github.io/Point-UV-Diffusion.</p> | -
dc.language | eng | -
dc.relation.ispartof | IEEE International Conference on Computer Vision 2023 (02/10/2023-06/10/2023, Paris) | -
dc.title | Texture Generation on 3D Meshes with Point-UV Diffusion | -
dc.type | Conference_Paper | -
