EVA-RNA: A Scaling Cross-Species Transcriptomic Foundation Model for Immunology & Inflammation

Published: 04 Mar 2026, Last Modified: 06 Mar 2026 · ICLR 2026 Workshop LMRL Poster · CC BY 4.0
Confirmation: I have read and agree with the workshop's policy on behalf of myself and my co-authors.
Track: long paper (4–8 pages excluding references)
Keywords: foundation models, transcriptomics, immunology, inflammation, drug development, scaling law, transfer learning, perturbation, clinical translation, LLM
TL;DR: EVA-RNA: 300M-param transcriptomic model trained on 545k human+mouse samples for immunology. Shows power-law scaling without plateau, outperforms existing models on clinical tasks, and learns unified cross-species representations.
Abstract: Recent studies have revealed that transcriptomic foundation models often fail to outperform simple baselines on clinically relevant tasks, suggesting a disconnect between pretraining objectives and useful representations. To bridge this gap, we introduce EVA-RNA, a transformer model pretrained on a curated corpus of over 500k samples spanning human and mouse, including bulk RNA-seq, microarray, and pseudobulked single-cell data, with a focus on Immunology & Inflammation. EVA-RNA exhibits clear power-law scaling across 7M to 300M parameters, with no sign of plateauing, in contrast to prior reports of diminishing returns in single-species models. Moreover, pretraining improvements consistently translate to downstream performance, as measured by a holistic benchmark spanning drug discovery, preclinical translation, and clinical applications. Finally, we conduct explainability experiments to explore (i) the concepts in EVA-RNA's representations, (ii) the structure of orthologous genes in latent space, and (iii) the evolution of intrinsic dimensionality across layers and throughout training.
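The abstract's central empirical claim is power-law scaling of pretraining loss with model size. As a minimal illustrative sketch of how such a claim is typically checked, the snippet below fits L(N) = a·N^(−b) to loss-versus-parameter-count pairs; the data points, constants, and initial guesses are hypothetical placeholders, not EVA-RNA results or the authors' actual fitting procedure.

```python
# Illustrative sketch only: fitting a power law L(N) = a * N**(-b) to
# hypothetical (parameter count, pretraining loss) pairs, in the spirit
# of the scaling analysis described in the abstract. All numbers below
# are made up for demonstration purposes.
import numpy as np
from scipy.optimize import curve_fit

params = np.array([7e6, 30e6, 100e6, 300e6])  # model sizes (hypothetical)
losses = np.array([2.10, 1.85, 1.66, 1.52])   # pretraining losses (hypothetical)

def power_law(n, a, b):
    """Power-law scaling curve: loss as a function of parameter count."""
    return a * n ** (-b)

# Fit the exponent b and prefactor a; p0 is a rough initial guess.
(a, b), _ = curve_fit(power_law, params, losses, p0=(10.0, 0.1))
print(f"fit: L(N) = {a:.2f} * N^(-{b:.3f})")

# A straight line in log-log space (small residuals, no upward bend at
# the largest N) is the usual signature of scaling without a plateau.
residuals = np.log(losses) - np.log(power_law(params, a, b))
print("log-space residuals:", np.round(residuals, 4))
```

In practice, a "no sign of plateauing" claim is supported when the largest models still lie on the fitted log-log line rather than curving above it.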
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Submission Number: 28