Wearables As Graph: Personalized Health Insights via Dynamic Retrieval from Adaptive Knowledge Graphs

14 Sept 2025 (modified: 21 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Wearable Data, Large Language Models (LLMs), Retrieval-Augmented Generation (RAG), Knowledge Graphs
Abstract: The growing prevalence of multi-sensor wearable devices generates vast amounts of long-term, multimodal time-series data, posing significant challenges both for manual analysis and for supplying context to Large Language Models (LLMs). Current LLM-based health analysis methods typically rely on manually curated context, which becomes impractical as data volume and sensor diversity increase. To overcome these limitations, we introduce \textbf{Wearable As Graph (WAG)}, a novel framework that automates context retrieval for LLMs using personalized knowledge graphs. WAG constructs knowledge graphs that map relationships between wearable modalities and incorporate user-specific data. We develop a data-driven retrieval pipeline that leverages both global (long-term) and local (short-term) relationships among metrics to identify the nodes most relevant to a user query. We evaluate WAG on a benchmark of over 10k data-associated queries created from multiple wearable datasets. Both LLM- and human-based evaluations show that WAG substantially improves response quality, achieving a $\sim$70\% win rate over baseline methods. Ablation studies further demonstrate the complementary value of global modeling (implemented via Hierarchical Bayesian Modeling to integrate general knowledge, population trends, and individual variation) and local modeling (adapted based on anomalies and query openness). WAG pioneers a wearable knowledge graph, a tailored retrieval algorithm, and a real-data-based query set, creating a foundation for future research in wearable-based health monitoring.
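To make the global/local retrieval idea in the abstract concrete, here is a minimal, hypothetical sketch (not the paper's actual algorithm): node scores blend a global, long-term relevance signal with a local, short-term one, and the blend weight shifts toward local signals for narrow queries and toward global context for open-ended ones. All names and the weighting rule are illustrative assumptions.

```python
def score_nodes(global_rel, local_rel, query_openness):
    """Rank knowledge-graph nodes by blending relevance signals.

    global_rel: dict node -> long-term relevance (e.g., from population
                and individual trend modeling); local_rel: dict node ->
    short-term relevance (e.g., from recent anomalies). query_openness
    in [0, 1]: 0 = narrow query (favor local), 1 = open-ended (favor
    global). The linear blend below is a hypothetical choice.
    """
    w_local = 1.0 - query_openness
    nodes = set(global_rel) | set(local_rel)
    scores = {
        n: w_local * local_rel.get(n, 0.0)
           + (1.0 - w_local) * global_rel.get(n, 0.0)
        for n in nodes
    }
    # Return nodes sorted from most to least relevant.
    return sorted(scores, key=scores.get, reverse=True)
```

For example, with a recent sleep anomaly (`local_rel = {"sleep": 0.8}`) and a strong long-term heart-rate trend (`global_rel = {"hr": 0.9, "sleep": 0.2}`), a mid-openness query would rank `sleep` above `hr`.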
Supplementary Material: zip
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 4945