Dual-Augmentation Graph Contrastive Pretraining for Transistor-level Integrated Circuits

19 Sept 2025 (modified: 12 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Electronic Design Automation, Pretraining, Self-supervised Learning, Graph Contrastive Learning
TL;DR: DICE is a GNN pretrained with graph contrastive learning, leveraging two transistor-level augmentations to generate over 10,000× more circuit topologies and achieve strong gains on three downstream tasks.
Abstract: Structural information is the essence of graph data, and pretraining graph models aims to train neural networks that efficiently extract topological features. In electronic design automation (EDA), recent approaches use self-supervised graph pretraining to learn circuit representations and then fine-tune these pretrained models for downstream applications such as circuit performance prediction. However, due to limited circuit structures and the predominant focus on gate-level modeling, most existing methods can only extract structural information tied to specific tasks or circuit types (e.g., digital). To address this issue, we propose DICE: Device-level Integrated Circuits Encoder, a graph neural network (GNN) pretrained at the transistor level. DICE models any circuit regardless of the signals it processes, and is made task-agnostic through graph contrastive learning on diverse circuit topologies. Our key contribution is a dual-augmentation technique that generates over $10,000\times$ more topologies than prior transistor-level work, thereby substantially increasing the structural diversity of circuits. Experimental results on three downstream tasks demonstrate significant performance gains on graph-level predictions, underscoring the effectiveness of DICE for transistor-level circuits. Code is released at https://anonymous.4open.science/r/DICE-ICLR2026.
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 19883
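The abstract describes pretraining a GNN by contrasting two augmented views of each transistor-level circuit graph. As a rough illustration of that setup, the minimal PyTorch sketch below contrasts graph embeddings of two augmented views with an NT-Xent (InfoNCE) loss. The dense GCN encoder, the random edge-drop stand-in augmentation, and all hyperparameters here are assumptions for illustration only; they are not DICE's actual dual transistor-level augmentations or architecture.

```python
# Minimal sketch (not the authors' implementation): graph contrastive
# pretraining with two augmented views and an NT-Xent (InfoNCE) loss.
# Encoder, augmentation, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DenseGCNEncoder(nn.Module):
    """Two-layer GCN on a dense adjacency matrix with mean pooling."""

    def __init__(self, in_dim: int, hid_dim: int, out_dim: int):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hid_dim)
        self.lin2 = nn.Linear(hid_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Symmetric normalisation: D^{-1/2} (A + I) D^{-1/2}
        a_hat = adj + torch.eye(adj.size(0), device=adj.device)
        deg_inv_sqrt = a_hat.sum(dim=1).clamp(min=1).pow(-0.5)
        a_norm = deg_inv_sqrt.unsqueeze(1) * a_hat * deg_inv_sqrt.unsqueeze(0)
        h = F.relu(self.lin1(a_norm @ x))
        h = self.lin2(a_norm @ h)
        return h.mean(dim=0)  # graph-level embedding


def drop_edges(adj: torch.Tensor, p: float = 0.2) -> torch.Tensor:
    """Placeholder augmentation: randomly drop edges (stand-in only)."""
    mask = (torch.rand_like(adj) > p).float()
    mask = torch.triu(mask, diagonal=1)
    mask = mask + mask.t()
    return adj * mask


def nt_xent(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """NT-Xent loss: each graph's two views are positives, others are negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)               # (2N, d)
    sim = z @ z.t() / tau                         # scaled cosine similarities
    n = z1.size(0)
    diag = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(diag, float("-inf"))    # exclude self-pairs
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)


if __name__ == "__main__":
    torch.manual_seed(0)
    encoder = DenseGCNEncoder(in_dim=8, hid_dim=32, out_dim=16)
    opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)

    # Toy batch of random "circuit graphs" (node features + adjacency).
    graphs = [(torch.randn(12, 8), (torch.rand(12, 12) > 0.7).float()) for _ in range(4)]

    # Two augmented views per graph -> paired graph-level embeddings.
    z1 = torch.stack([encoder(x, drop_edges(a)) for x, a in graphs])
    z2 = torch.stack([encoder(x, drop_edges(a)) for x, a in graphs])
    loss = nt_xent(z1, z2)
    loss.backward()
    opt.step()
    print(f"contrastive loss: {loss.item():.4f}")
```

In a setting like DICE's, the two stochastic views would instead come from the paper's transistor-level augmentations applied to real circuit netlists, with the pretrained encoder later fine-tuned on downstream graph-level prediction tasks.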