Geometric Considerations for Normalization Layers in Equivariant Neural Networks

Published: 22 Nov 2022, Last Modified: 05 May 2023, AI4Mat 2022 Poster
Abstract: In recent years, neural networks that incorporate physical symmetry in their architecture have become indispensable tools for overcoming the scarcity of molecular and material data. However, despite their critical importance in deep learning, the design and selection of normalization layers have often been treated as a side issue. In this study, we first review the unique challenges that batch normalization (BatchNorm) faces in its application to materials science and provide an overview of alternative normalization layers that can address the geometric considerations required by physical systems and tasks. While the challenges are diverse, we find that a \emph{geometric match} of a normalization layer can be achieved by ensuring that the normalization preserves not only the invariance and equivariance but also the covariance of the task and dataset. Overall, our survey provides a coherent overview of normalization layers for practitioners and presents open challenges for further development.
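To illustrate the kind of geometric consideration the abstract refers to, the following is a minimal sketch (not the paper's own method) of a rotation-equivariant batch normalization for 3D vector features in PyTorch: it rescales each vector channel by an invariant statistic (the mean vector norm over the batch) instead of normalizing Cartesian components independently, so directions are untouched and rotating the input rotates the output identically. The class name `EquivariantBatchNorm` and all hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn


class EquivariantBatchNorm(nn.Module):
    """Sketch of a rotation-equivariant BatchNorm for per-channel 3D vectors."""

    def __init__(self, num_channels: int, eps: float = 1e-5, momentum: float = 0.1):
        super().__init__()
        self.eps = eps
        self.momentum = momentum
        # Learnable per-channel scale acts on an invariant quantity, so it is safe.
        self.gamma = nn.Parameter(torch.ones(num_channels))
        self.register_buffer("running_norm", torch.ones(num_channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_channels, 3), one 3D vector per channel.
        norms = x.norm(dim=-1)                     # (batch, num_channels), rotation-invariant
        if self.training:
            batch_norm = norms.mean(dim=0)         # mean vector norm per channel
            self.running_norm = (
                (1 - self.momentum) * self.running_norm
                + self.momentum * batch_norm.detach()
            )
        else:
            batch_norm = self.running_norm
        # Dividing by an invariant scalar preserves equivariance of the vector features.
        return self.gamma.view(1, -1, 1) * x / (batch_norm.view(1, -1, 1) + self.eps)
```

A quick sanity check of the equivariance claim: for a random rotation matrix `R`, `layer(x @ R.T)` should match `layer(x) @ R.T` up to numerical precision, whereas a standard per-component BatchNorm would not satisfy this.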
Paper Track: Behind the Scenes
Submission Category: Other