Graph Neural Networks with Directional Encodings for Anisotropic Elasticity

23 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Graph Neural Networks, Anisotropic Elasticity, Elastodynamics
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We enhance standard mesh-based graph neural networks with directional encodings to represent material anisotropy for elastodynamics simulation.
Abstract: Simulating the behavior of nonlinear and anisotropic materials is a central problem with applications across engineering, computer graphics, robotics, and beyond. While conventional mesh-based simulations provide accurate and reliable predictions, their computational overhead typically prevents their use in interactive applications. Graph neural networks (GNNs) have recently emerged as a compelling alternative to conventional simulations for time-critical applications. However, existing GNN-based methods cannot distinguish between deformations in different directions and are thus limited to isotropic materials. To address this limitation, we propose a novel and easy-to-implement GNN architecture based on directional encodings of edge features. By preserving directional information during message passing, our method has access to the full state of deformation and can thus model anisotropic materials. We demonstrate through a set of qualitative and quantitative evaluations that our approach outperforms existing mesh-based GNN approaches for modeling anisotropic materials.
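To make the abstract's core idea concrete, below is a minimal, hypothetical sketch (not the authors' implementation, which is not available from this page) of a mesh-based message-passing layer in which edge features retain vector-valued directional encodings rather than being reduced to scalar distances. All class names, feature sizes, and the exact encoding are illustrative assumptions.

```python
# Hypothetical sketch: directional edge encodings in a mesh GNN message-passing layer.
# Assumption: edge features carry full relative-position vectors (in the rest/material
# configuration and the deformed configuration) plus their norms, so the direction of
# deformation is visible to the network and anisotropic response can be learned.
import torch
import torch.nn as nn


def directional_edge_encoding(rest_pos, cur_pos, edge_index):
    # Illustrative encoding: keep the relative-position *vectors*, not just distances.
    senders, receivers = edge_index
    d_rest = rest_pos[receivers] - rest_pos[senders]   # direction in material frame
    d_cur = cur_pos[receivers] - cur_pos[senders]      # direction in deformed frame
    return torch.cat([d_rest, d_rest.norm(dim=-1, keepdim=True),
                      d_cur, d_cur.norm(dim=-1, keepdim=True)], dim=-1)


class DirectionalEdgeMPLayer(nn.Module):
    """One message-passing step that updates edge and node latents."""

    def __init__(self, node_dim=64, edge_dim=64, hidden=128):
        super().__init__()
        # Edge update: sender state, receiver state, and current edge latent.
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * node_dim + edge_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, edge_dim))
        # Node update: node state plus aggregated incoming edge messages.
        self.node_mlp = nn.Sequential(
            nn.Linear(node_dim + edge_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, node_dim))

    def forward(self, node_feat, edge_feat, edge_index):
        senders, receivers = edge_index  # shape (2, E)
        # Update edge latents; directional information stays in edge_feat instead of
        # being collapsed to an isotropic (distance-only) summary.
        edge_in = torch.cat(
            [node_feat[senders], node_feat[receivers], edge_feat], dim=-1)
        edge_feat = edge_feat + self.edge_mlp(edge_in)
        # Sum-aggregate messages per receiving node, then update node latents.
        agg = torch.zeros(node_feat.size(0), edge_feat.size(-1),
                          device=node_feat.device, dtype=node_feat.dtype)
        agg.index_add_(0, receivers, edge_feat)
        node_feat = node_feat + self.node_mlp(torch.cat([node_feat, agg], dim=-1))
        return node_feat, edge_feat
```

In this sketch, stacking several such layers after encoding `directional_edge_encoding(...)` into the initial edge latents would give a MeshGraphNets-style processor whose messages remain direction-aware; the actual architecture proposed in the paper may differ in its encoding and update details.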
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6889