Abstract: Representation learning for analog circuits is challenging because devices exhibit continuous electrical characteristics, in contrast to the discrete states of digital circuits. While graph neural networks (GNNs) show promise on analog circuit tasks, existing methods neglect the intrinsic electrical properties that govern device-specific behaviors. Traditional device feature encodings have clear limitations: one-hot encoding is space-consuming and fails to capture inter-device similarities, while text-based encoding introduces estimation errors. We propose Ckt2Vec, a novel framework that integrates electrical characteristics into analog circuit representation learning. By encoding frequency-domain embeddings of current-voltage (I-V) curves via a spectral extractor, Ckt2Vec compresses nonlinear device-specific behaviors into low-dimensional embeddings while preserving physical fidelity. A graph-based contrastive learning approach then generates hierarchical circuit representations that capture both block- and system-level interactions. Evaluated on three downstream tasks, namely circuit classification, subcircuit detection, and circuit edit distance prediction, Ckt2Vec outperforms traditional one-hot and text-based encodings with lower space consumption and a stronger ability to capture analog behavior.
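The abstract does not specify the spectral extractor's exact form; the following is only a minimal sketch of the underlying idea, assuming the extractor amounts to taking a discrete Fourier transform of a uniformly sampled I-V sweep and keeping a few low-frequency magnitudes as the device embedding. The function name `spectral_encode` and the diode-style example curve are illustrative, not from the paper.

```python
import numpy as np

def spectral_encode(i_samples, k=8):
    """Compress a sampled I-V curve into a k-dimensional spectral embedding.

    Takes the real FFT of the current response over a uniform voltage
    sweep and keeps the magnitudes of the first k frequency bins, so the
    nonlinear curve shape is summarized by a short fixed-length vector.
    (Hypothetical stand-in for the paper's spectral extractor.)
    """
    spectrum = np.fft.rfft(i_samples)        # frequency-domain view of the curve
    coeffs = np.abs(spectrum[:k])            # low-frequency magnitudes only
    return coeffs / (np.linalg.norm(coeffs) + 1e-12)  # unit-norm embedding

# Illustrative input: a Shockley-style exponential I-V curve
v = np.linspace(0.0, 1.0, 256)               # uniform voltage sweep
i = 1e-12 * np.expm1(v / 0.026)              # diode-like current response
emb = spectral_encode(i, k=8)                # 8-dimensional device embedding
```

Devices with similar I-V behavior yield nearby embeddings under this encoding, which is the property one-hot encoding lacks.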
DOI: 10.1109/TCAD.2025.3643366