
Conference Paper: Learning 3D Garment Animation from Trajectories of A Piece of Cloth

Title: Learning 3D Garment Animation from Trajectories of A Piece of Cloth
Authors: Shao, Yidi; Chen, Change Loy; Dai, Bo
Issue Date: 5-Jun-2025
Abstract

Garment animation is ubiquitous in applications such as virtual reality, gaming, and film production. Learning-based approaches achieve compelling performance in animating diverse garments under versatile scenarios. Nevertheless, to mimic the deformations of observed garments, data-driven methods often require large-scale garment data, which are expensive and time-consuming to collect. In addition, forcing models to match the dynamics of observed garment animation may limit their ability to generalize to unseen cases. In this paper, instead of using garment-wise supervised learning, we adopt a disentangled scheme to learn how to animate observed garments: 1) learning constitutive behaviors from an observed piece of cloth; 2) dynamically animating various garments constrained by the learned constitutive laws. Specifically, we propose an Energy Unit network (EUNet) to model the constitutive relations in the form of energy. Without priors from analytical physics models or differentiable simulation engines, EUNet directly captures the constitutive behaviors from the observed piece of cloth and uniformly describes the change of energy caused by deformations such as stretching and bending. We further apply the pre-trained EUNet to animate various garments via energy optimization. The disentangled scheme alleviates the need for garment data and enables us to exploit the dynamics of a piece of cloth for animating garments. Experiments show that EUNet effectively delivers the energy gradients arising from deformations; models constrained by EUNet achieve more stable and physically plausible performance compared with those trained in a garment-wise supervised manner. Code is available at https://github.com/ftbabi/EUNet_NeurIPS2024.git.
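
To make the two-stage scheme concrete, below is a minimal PyTorch sketch of the general idea only: a small MLP (here called EnergyUnit, a hypothetical stand-in for EUNet) maps per-edge deformation features to a scalar energy, and garment vertices are then animated by descending the gradient of that energy. The feature choice (edge strain), network shape, and quasi-static update rule are all illustrative assumptions, not the paper's actual EUNet architecture or optimizer; see the linked repository for the real implementation.

# Hypothetical sketch of the disentangled scheme described in the abstract:
# (1) a learned per-edge energy model, (2) animation by energy descent.
# EnergyUnit, the strain feature, and the update rule are illustrative
# assumptions, not the paper's EUNet implementation.
import torch
import torch.nn as nn

class EnergyUnit(nn.Module):
    """Maps per-edge deformation features to a total scalar energy (assumed form)."""
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, verts, edges, rest_len):
        # Edge strain relative to rest length as a simple deformation feature.
        d = verts[edges[:, 0]] - verts[edges[:, 1]]                    # (E, 3)
        strain = (d.norm(dim=-1, keepdim=True) - rest_len) / rest_len  # (E, 1)
        return self.mlp(strain).sum()                                  # scalar energy

def animate_step(model, verts, edges, rest_len, lr=1e-2):
    """One quasi-static step: move vertices down the learned energy gradient."""
    verts = verts.detach().requires_grad_(True)
    energy = model(verts, edges, rest_len)
    (grad,) = torch.autograd.grad(energy, verts)
    return (verts - lr * grad).detach()

# Toy usage: a single stretched triangle relaxing under the learned energy.
verts = torch.tensor([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [0.0, 1.2, 0.0]])
edges = torch.tensor([[0, 1], [1, 2], [2, 0]])
rest_len = torch.ones(3, 1)  # assumed rest edge lengths
model = EnergyUnit()         # in the paper, this would be pre-trained on cloth trajectories
for _ in range(10):
    verts = animate_step(model, verts, edges, rest_len)

The key property the sketch preserves is the disentanglement: the energy model is trained once on a piece of cloth, while any garment mesh (vertices plus edges) can then be animated through the same gradient-based optimization without garment-specific supervision.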


Persistent Identifier: http://hdl.handle.net/10722/359496


DC Field | Value | Language
dc.contributor.author | Shao, Yidi | -
dc.contributor.author | Chen, Change Loy | -
dc.contributor.author | Dai, Bo | -
dc.date.accessioned | 2025-09-07T00:30:43Z | -
dc.date.available | 2025-09-07T00:30:43Z | -
dc.date.issued | 2025-06-05 | -
dc.identifier.uri | http://hdl.handle.net/10722/359496 | -
dc.description.abstract | Garment animation is ubiquitous in applications such as virtual reality, gaming, and film production. Learning-based approaches achieve compelling performance in animating diverse garments under versatile scenarios. Nevertheless, to mimic the deformations of observed garments, data-driven methods often require large-scale garment data, which are expensive and time-consuming to collect. In addition, forcing models to match the dynamics of observed garment animation may limit their ability to generalize to unseen cases. In this paper, instead of using garment-wise supervised learning, we adopt a disentangled scheme to learn how to animate observed garments: 1) learning constitutive behaviors from an observed piece of cloth; 2) dynamically animating various garments constrained by the learned constitutive laws. Specifically, we propose an Energy Unit network (EUNet) to model the constitutive relations in the form of energy. Without priors from analytical physics models or differentiable simulation engines, EUNet directly captures the constitutive behaviors from the observed piece of cloth and uniformly describes the change of energy caused by deformations such as stretching and bending. We further apply the pre-trained EUNet to animate various garments via energy optimization. The disentangled scheme alleviates the need for garment data and enables us to exploit the dynamics of a piece of cloth for animating garments. Experiments show that EUNet effectively delivers the energy gradients arising from deformations; models constrained by EUNet achieve more stable and physically plausible performance compared with those trained in a garment-wise supervised manner. Code is available at https://github.com/ftbabi/EUNet_NeurIPS2024.git. | -
dc.language | eng | -
dc.relation.ispartof | Neural Information Processing Systems (NeurIPS), 2024 (10/12/2024-15/12/2024, Vancouver, Canada) | -
dc.title | Learning 3D Garment Animation from Trajectories of A Piece of Cloth | -
dc.type | Conference_Paper | -
dc.identifier.spage | 41803 | -
dc.identifier.epage | 41825 | -
