Contrastive Time Series Representation Learning for Neurochemical Concentration Prediction

Published: 23 Sept 2025 · Last Modified: 01 Dec 2025 · TS4H NeurIPS 2025 Poster · CC BY 4.0
Keywords: Cyclic voltammetry, Electrochemical biosensing, Metadata-aware contrastive learning, Time-series representation learning
TL;DR: We present a metadata-aware contrastive learning framework that overcomes analyte interactions, normalization limits, and batch effects to enable accurate multiplex neurochemical concentration prediction from cyclic voltammetry data.
Abstract: Accurately characterizing the neurochemical environment is essential for advancing the understanding and treatment of neurological and psychiatric disorders. Fast-scan cyclic voltammetry (FSCV) enables high-temporal-resolution measurement of neurotransmitter dynamics, but predicting multiplex concentrations in complex fluids remains an open challenge. We identify three key obstacles: nonlinear analyte interactions, the limited utility of normalization, and pronounced batch effects stemming from sensor fabrication variability. To address these, we propose a metadata-aware contrastive representation learning framework that explicitly incorporates batch identity, scan sequence, and scan rate to model experimental variability. Preference-based ranking losses emphasize subtle yet discriminative features of voltammetry curves, and the learned representations are decoupled from downstream predictors. A convolutional neural network is then applied to capture nonlinearity in concentration prediction. Preliminary results show improved accuracy over traditional baselines, highlighting a promising direction at the intersection of time-series representation learning and multiplexed biosensing.
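To make the "metadata-aware contrastive" idea concrete, below is a minimal sketch of one plausible instantiation: a supervised-contrastive-style loss in which voltammetry scans sharing a metadata key (here, batch identity) are treated as positives. This is an illustration only; the paper's actual objective, including how scan sequence and scan rate enter and the preference-based ranking component, is not specified by the abstract, and the function and variable names here are hypothetical.

```python
import numpy as np

def metadata_contrastive_loss(embeddings, batch_ids, temperature=0.1):
    """Illustrative metadata-keyed contrastive loss (InfoNCE-style).

    Scans with the same batch_id are treated as positive pairs; all other
    scans act as negatives. This is a sketch, not the paper's exact loss.
    """
    # L2-normalize embeddings so the dot product is cosine similarity.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = (z @ z.T) / temperature
    n = len(batch_ids)
    not_self = ~np.eye(n, dtype=bool)
    # Positives: same metadata key (batch), excluding the anchor itself.
    pos_mask = (batch_ids[:, None] == batch_ids[None, :]) & not_self
    # Stable log-softmax over all non-anchor samples.
    logits = sim - sim.max(axis=1, keepdims=True)
    exp = np.exp(logits) * not_self
    log_prob = logits - np.log(exp.sum(axis=1, keepdims=True))
    # Mean negative log-probability of positives, per anchor with positives.
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0
    per_anchor = -(log_prob * pos_mask).sum(axis=1)[valid] / pos_counts[valid]
    return per_anchor.mean()

# Toy usage: 8 scan embeddings from 4 fabrication batches.
rng = np.random.default_rng(0)
emb = rng.normal(size=(8, 16))
batches = np.array([0, 0, 1, 1, 2, 2, 3, 3])
loss = metadata_contrastive_loss(emb, batches)
```

In a full pipeline, an encoder trained with a loss of this shape would produce representations that are then frozen and passed to a separate CNN regressor, matching the abstract's decoupling of representation learning from concentration prediction.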
Submission Number: 85