Appears in Collections: Conference Paper

Texture Generation on 3D Meshes with Point-UV Diffusion
Title | Texture Generation on 3D Meshes with Point-UV Diffusion |
---|---|
Authors | Yu, Xin; Dai, Peng; Li, Wenbo; Ma, Lan; Liu, Zhengzhe; Qi, Xiaojuan |
Issue Date | 2-Oct-2023 |
Abstract | In this work, we focus on synthesizing high-quality textures on 3D meshes. We present Point-UV diffusion, a coarse-to-fine pipeline that marries the denoising diffusion model with UV mapping to generate 3D consistent and high-quality texture images in UV space. We start by introducing a point diffusion model to synthesize low-frequency texture components with our tailored style guidance to tackle the biased color distribution. The derived coarse texture offers global consistency and serves as a condition for the subsequent UV diffusion stage, aiding in regularizing the model to generate a 3D consistent UV texture image. Then, a UV diffusion model with hybrid conditions is developed to enhance the texture fidelity in the 2D UV space. Our method can process meshes of any genus, generating diversified, geometry-compatible, and high-fidelity textures. Code is available at https://cvmilab.github.io/Point-UV-Diffusion. |
Persistent Identifier | http://hdl.handle.net/10722/340970 |
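The abstract describes a two-stage, coarse-to-fine pipeline: a point diffusion model first produces a globally consistent low-frequency texture, which then conditions a UV diffusion model that adds detail in 2D UV space. The following toy sketch only illustrates that data flow; all names (`coarse_point_stage`, `fine_uv_stage`, `texture_pipeline`) are hypothetical stand-ins, and the simple color blending and UV rasterization here replace the trained denoising diffusion networks of the actual method.

```python
def coarse_point_stage(points, style_guidance):
    """Stand-in for the point diffusion model: assign each 3D surface point a
    coarse (low-frequency) color, nudged toward a global style color. A real
    model would denoise point colors with learned style guidance."""
    return [tuple(0.5 * coord + 0.5 * style
                  for coord, style in zip(point, style_guidance))
            for point in points]

def fine_uv_stage(coarse_colors, uv_coords, size=4):
    """Stand-in for the UV diffusion model: splat each coarse color into a
    small texture image at its UV location. A real model would synthesize
    high-frequency detail conditioned on this coarse texture."""
    image = [[(0.0, 0.0, 0.0)] * size for _ in range(size)]
    for (u, v), color in zip(uv_coords, coarse_colors):
        image[int(v * (size - 1))][int(u * (size - 1))] = color
    return image

def texture_pipeline(points, uv_coords, style_guidance):
    """Chain the two stages: coarse point colors condition the UV stage."""
    coarse = coarse_point_stage(points, style_guidance)
    return fine_uv_stage(coarse, uv_coords)
```

The key design point carried over from the abstract is the conditioning: the coarse stage's output is an input to the fine stage, which is what regularizes the final UV texture toward 3D consistency.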
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Yu, Xin | - |
dc.contributor.author | Dai, Peng | - |
dc.contributor.author | Li, Wenbo | - |
dc.contributor.author | Ma, Lan | - |
dc.contributor.author | Liu, Zhengzhe | - |
dc.contributor.author | Qi, Xiaojuan | - |
dc.date.accessioned | 2024-03-11T10:48:42Z | - |
dc.date.available | 2024-03-11T10:48:42Z | - |
dc.date.issued | 2023-10-02 | - |
dc.identifier.uri | http://hdl.handle.net/10722/340970 | - |
dc.description.abstract | <p>In this work, we focus on synthesizing high-quality textures on 3D meshes. We present Point-UV diffusion, a coarse-to-fine pipeline that marries the denoising diffusion model with UV mapping to generate 3D consistent and high-quality texture images in UV space. We start by introducing a point diffusion model to synthesize low-frequency texture components with our tailored style guidance to tackle the biased color distribution. The derived coarse texture offers global consistency and serves as a condition for the subsequent UV diffusion stage, aiding in regularizing the model to generate a 3D consistent UV texture image. Then, a UV diffusion model with hybrid conditions is developed to enhance the texture fidelity in the 2D UV space. Our method can process meshes of any genus, generating diversified, geometry-compatible, and high-fidelity textures. Code is available at https://cvmilab.github.io/Point-UV-Diffusion.</p> | - |
dc.language | eng | - |
dc.relation.ispartof | IEEE International Conference on Computer Vision 2023 (02/10/2023-06/10/2023, Paris) | - |
dc.title | Texture Generation on 3D Meshes with Point-UV Diffusion | - |
dc.type | Conference_Paper | - |