Geometry-Aware Equivariant Attention for Scalable Nanoelectronic Property Prediction

20 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Equivariant Neural Networks, Material Simulation, Nanoelectronics
TL;DR: We introduce a novel boundary attention mechanism for scalable material property prediction of nanoscale structures
Abstract: All advanced nanoelectronic devices, including transistors, image sensors, and LEDs, rely on materials and interfaces scaled down to a few nanometers. At these dimensions, material properties change in nontrivial ways due to quantum confinement and atomic-level variability, creating a multi-scale modeling challenge that requires atomistic simulations for accurate prediction. However, such simulations are often prohibitively slow or intractable, making highly expensive iterative rounds of experimentation the default option. In this work, we introduce EBFormer, a geometry-aware equivariant neural network that predicts electronic properties of nanostructures by jointly capturing atomistic interactions and global geometric effects, achieving orders-of-magnitude speed-ups over state-of-the-art physical simulators while preserving high accuracy. This is accomplished through a novel boundary cross-attention mechanism, a scalable approach that augments local graph convolution with information about the nanostructure geometry. We validate EBFormer on nanowire and nanosheet transistors, representative of the most advanced nanoelectronic architectures currently in use, and show improved in-distribution and out-of-distribution performance on both material properties and downstream device characteristics. Combined with superior asymptotic scalability and data and parameter efficiency, our work paves a path to atomistic, automated, high-throughput, and predictive nanoscale design that is otherwise unavailable today.
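The abstract describes boundary cross-attention only at a high level: per-atom features produced by local graph convolution attend to a small set of tokens summarizing the global nanostructure geometry, so the cost stays linear in the number of atoms for a fixed token budget. As a rough illustration only (all names, shapes, and the single-head formulation are assumptions, and the equivariance machinery of the actual model is omitted), such an update might look like:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def boundary_cross_attention(node_feats, boundary_tokens, Wq, Wk, Wv):
    """Hypothetical sketch: each atom queries a set of boundary tokens.

    node_feats:      (N, d) per-atom features from local graph convolution
    boundary_tokens: (B, d) features summarizing the nanostructure boundary
    Wq, Wk, Wv:      (d, d) learned projections (random here for illustration)

    Cost is O(N * B): linear in atom count N for a fixed token budget B.
    """
    Q = node_feats @ Wq                                       # (N, d) queries
    K = boundary_tokens @ Wk                                  # (B, d) keys
    V = boundary_tokens @ Wv                                  # (B, d) values
    attn = softmax(Q @ K.T / np.sqrt(Q.shape[-1]), axis=-1)   # (N, B) weights
    return node_feats + attn @ V                              # residual update
```

Because B is small and independent of system size, this kind of cross-attention avoids the quadratic cost of full atom-to-atom attention, which is one plausible reading of the scalability claim in the abstract.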
Supplementary Material: zip
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 23577