A physics-embedded deep learning framework for cloth simulation
ID:68
Submission ID:260
Updated Time:2024-10-08 17:44:19
Poster Presentation
Abstract
Delicate cloth simulations have long been desired in computer graphics, and various methods have been proposed to improve force interactions, collision handling, and numerical integration. Deep learning has the potential to achieve fast, real-time simulation, but common neural network (NN) structures often demand many parameters to capture cloth dynamics. This paper proposes a physics-embedded learning framework that directly encodes the physical features of cloth simulation. A convolutional neural network represents the spatial correlations of the mass-spring system, after which three branches learn the linear, nonlinear, and time-derivative features of cloth physics. The framework can also incorporate external forces and collision handling through either traditional simulators or auxiliary neural networks. The model is tested on different cloth animation cases without being trained on new data; agreement with baselines and the realism of its predictions validate its generalization ability. The inference efficiency of the proposed model also surpasses that of traditional physics simulation. The framework is further designed to integrate easily with visual refinement techniques such as wrinkle carving, leaving ample opportunity to incorporate prevailing machine learning techniques in 3D cloth animation.
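To make the described architecture concrete, the sketch below shows one possible way to organize a shared convolutional encoder followed by linear, nonlinear, and time-derivative branches in PyTorch. It is a minimal illustration under stated assumptions, not the authors' implementation: the regular H x W grid representation of cloth vertices, the layer sizes, the way the branch outputs are summed into an acceleration, and the explicit time integration step are all illustrative choices.

```python
# Minimal sketch (assumed, not the authors' code) of a physics-embedded
# three-branch cloth network. Cloth state is represented as (B, 3, H, W)
# grids of vertex positions and velocities.
import torch
import torch.nn as nn


class PhysicsEmbeddedClothNet(nn.Module):
    def __init__(self, hidden_channels: int = 32):
        super().__init__()
        # Shared convolutional encoder: captures spatial correlations of the
        # mass-spring grid (neighboring vertices coupled by springs).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, hidden_channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(hidden_channels, hidden_channels, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # Branch 1: linear features (e.g., spring-like restoring terms).
        self.linear_branch = nn.Conv2d(hidden_channels, 3, kernel_size=1)
        # Branch 2: nonlinear features (e.g., large-deformation effects).
        self.nonlinear_branch = nn.Sequential(
            nn.Conv2d(hidden_channels, hidden_channels, kernel_size=3, padding=1),
            nn.Tanh(),
            nn.Conv2d(hidden_channels, 3, kernel_size=1),
        )
        # Branch 3: time-derivative features (velocity-dependent terms).
        self.derivative_branch = nn.Conv2d(hidden_channels, 3, kernel_size=1)

    def forward(self, positions: torch.Tensor, velocities: torch.Tensor,
                dt: float = 1.0 / 60.0):
        feat_x = self.encoder(positions)
        feat_v = self.encoder(velocities)
        # Sum the three branches into a per-vertex acceleration estimate;
        # external forces and collision handling would be added on top.
        accel = (self.linear_branch(feat_x)
                 + self.nonlinear_branch(feat_x)
                 + self.derivative_branch(feat_v))
        # Simple explicit update, purely for illustration.
        new_velocities = velocities + dt * accel
        new_positions = positions + dt * new_velocities
        return new_positions, new_velocities


if __name__ == "__main__":
    # Usage example with a 32 x 32 cloth grid.
    model = PhysicsEmbeddedClothNet()
    pos = torch.zeros(1, 3, 32, 32)
    vel = torch.zeros(1, 3, 32, 32)
    new_pos, new_vel = model(pos, vel)
    print(new_pos.shape, new_vel.shape)
```

In this sketch the same encoder is reused for positions and velocities so that all branches share one spatial representation of the mass-spring system; whether the actual framework shares the encoder or uses separate ones is not specified in the abstract.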
Keywords
Computer Graphics,Cloth Simulation,Machine Learning,Physics-based Animation
Submission Author
Zhiwei Zhao
UM-SJTU Joint Institute, Shanghai Jiao Tong University