ReproPheno and ReproPhenoNet: A Large-Scale Multimodal Benchmark Dataset and Deep Learning Framework for Reproductive-Stage Plant Phenotyping

Published: 09 Dec 2025, Last Modified: 25 Jan 2026
Venue: AgriAI 2026 Poster
License: CC BY 4.0
Keywords: Reproductive-stage phenotyping, benchmark dataset, multimodal image sequence analysis, deep neural networks.
Abstract: High-throughput plant phenotyping is an emerging interdisciplinary field of research that lies at the intersection of computer vision, plant science, artificial intelligence, data analytics and visualization, and genomics. To develop novel algorithms that advance image-based high-throughput plant phenotyping research with the aim of enabling precision farming and global food security, the publication of benchmark datasets is indispensable. Thus, this paper introduces ReproPheno, a large-scale, open-source dataset comprising multi-view image sequences of plants spanning their entire life cycle, captured using three camera modalities, namely visible light, fluorescence, and hyperspectral, within the LemnaTec Scanalyzer 3D high-throughput plant phenotyping facility. By enabling comprehensive analysis across multiple sensing modalities and temporal scales, this dataset opens the door to several novel research directions in computer vision, including multimodal co-segmentation, hyperspectral dataset distillation, and 3D model reconstruction of living organisms with growing architectural complexity. Furthermore, the paper presents ReproPhenoNet, a novel algorithm that uses a You Only Look Once (YOLO) deep neural network-based object detector to detect flowers and fruits from visible light and hyperspectral image sequences, respectively. These detections form the foundation for the quantitative computation of reproductive-stage plant phenotypes.
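The abstract describes deriving reproductive-stage phenotypes from per-frame flower and fruit detections. As an illustrative sketch only (the paper's actual pipeline is not specified here), the snippet below assumes a hypothetical YOLO-style output format of `(class_name, confidence)` tuples per imaging frame and shows how two simple phenotypes, first-flowering frame and per-frame fruit count, could be computed from such detections:

```python
from collections import Counter

def reproductive_phenotypes(detections_by_frame, conf_threshold=0.5):
    """Derive simple reproductive-stage phenotypes from per-frame detections.

    `detections_by_frame` maps a frame index (e.g., imaging day) to a list of
    (class_name, confidence) tuples, mimicking a YOLO-style detector output.
    This format and the class names "flower"/"fruit" are assumptions for
    illustration, not the paper's actual interface.
    Returns the first frame at which a flower is detected (or None) and a
    dict of per-frame fruit counts.
    """
    first_flowering = None
    fruit_counts = {}
    for frame in sorted(detections_by_frame):
        # Keep only detections above the confidence threshold.
        counts = Counter(
            cls for cls, conf in detections_by_frame[frame]
            if conf >= conf_threshold
        )
        if first_flowering is None and counts["flower"] > 0:
            first_flowering = frame
        fruit_counts[frame] = counts["fruit"]
    return first_flowering, fruit_counts
```

For example, with detections `{1: [("leaf", 0.9)], 2: [("flower", 0.8)], 3: [("flower", 0.9), ("fruit", 0.7)]}`, the function reports flowering onset at frame 2 and one fruit at frame 3. Real pipelines would typically also track detections across views and frames before aggregating counts.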
Submission Number: 4