Keywords: exposome, autoimmune disease, multi-modal models, Vision-Language Models (VLMs), Retrieval-Augmented Generation (RAG), personalized medicine, Gemini, Gemma
TL;DR: This paper introduces a multi-modal AI framework that analyzes a patient's lab reports and product usage against biomedical literature to identify specific environmental triggers for their autoimmune disease.
Abstract: The exposome, the totality of an individual's environmental exposures throughout their lifetime, is estimated to account for up to 70% of autoimmune disease risk. Despite this significant contribution, the systematic identification of patient-specific environmental triggers remains an intractable challenge in clinical practice. This translational gap arises from the difficulty of synthesizing vast, heterogeneous data sources: semi-structured clinical lab reports, patient product-usage history, and the exponentially growing corpus of biomedical literature on environmental toxicology and immunology. We introduce the Exposome Interpreter, a multi-modal framework designed to infer patient-specific relationships between environmental exposures and immunological dysregulation. Our approach first employs fine-tuned Vision-Language Models (VLMs), including Gemini 2.5 Flash and PaliGemma, for high-fidelity information extraction from visually complex lab reports, canonicalizing semi-structured biomarker data into a machine-readable format. Concurrently, a Retrieval-Augmented Generation (RAG) pipeline, leveraging a domain-adapted Gemma model, queries the biomedical literature to construct a knowledge graph linking chemical agents to specific immune pathways. By integrating the structured patient data with this synthesized knowledge base and the patient's product history, the Exposome Interpreter generates ranked, evidence-backed hypotheses for environmental triggers, including direct mappings of abnormal biomarkers to specific consumer products.
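To make the final integration step concrete, the minimal Python sketch below shows one plausible way to rank candidate triggers: canonicalized biomarker records (stand-ins for the VLM extraction output) are cross-referenced against a toy chemical-to-biomarker knowledge graph (a stand-in for the RAG-built graph) and a product-to-chemical ingredient mapping. All class names, schemas, and mappings here are illustrative assumptions, not the paper's implementation.

```python
from dataclasses import dataclass

# Hypothetical canonical biomarker record, standing in for the output of
# the VLM extraction stage (field names are illustrative, not the paper's schema).
@dataclass
class Biomarker:
    name: str
    value: float
    ref_low: float
    ref_high: float

    @property
    def abnormal(self) -> bool:
        # A biomarker is flagged abnormal if it falls outside its reference range.
        return not (self.ref_low <= self.value <= self.ref_high)

# Toy knowledge graph linking chemical agents to biomarkers perturbed via
# immune pathways; in the paper this is built by the RAG pipeline over
# biomedical literature. These associations are illustrative only.
CHEMICAL_TO_BIOMARKERS = {
    "triclosan": {"TSH", "ANA titer"},
    "formaldehyde": {"IgE", "CRP"},
    "parabens": {"ANA titer"},
}

# Patient product-usage history mapped to chemical ingredients
# (hypothetical mapping for illustration).
PRODUCT_TO_CHEMICALS = {
    "antibacterial soap": {"triclosan"},
    "nail polish": {"formaldehyde", "parabens"},
}

def rank_triggers(biomarkers, products):
    """Rank products by how many of the patient's abnormal biomarkers are
    linked, via chemical ingredients, to knowledge-graph evidence."""
    abnormal = {b.name for b in biomarkers if b.abnormal}
    scores = {}
    for product in products:
        linked = set()
        for chem in PRODUCT_TO_CHEMICALS.get(product, set()):
            linked |= CHEMICAL_TO_BIOMARKERS.get(chem, set()) & abnormal
        if linked:
            scores[product] = sorted(linked)
    # Sort candidate triggers by evidence count, descending.
    return sorted(scores.items(), key=lambda kv: -len(kv[1]))

if __name__ == "__main__":
    labs = [
        Biomarker("TSH", 6.2, 0.4, 4.0),      # abnormal (above range)
        Biomarker("ANA titer", 160, 0, 80),   # abnormal (above range)
        Biomarker("CRP", 1.0, 0.0, 3.0),      # normal
    ]
    for product, evidence in rank_triggers(labs, ["antibacterial soap", "nail polish"]):
        print(f"{product}: linked abnormal biomarkers -> {evidence}")
```

Running the sketch prints the products ordered by how many abnormal biomarkers they can account for, mirroring the "ranked, evidence-backed hypotheses" output the abstract describes; the real system would replace the toy dictionaries with literature-grounded graph queries.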
Submission Number: 14