When YOLO Meets SAM: Data-Efficient Weed Density Estimation

Published: 09 Dec 2025 · Last Modified: 25 Jan 2026 · AgriAI 2026 Poster · CC BY 4.0
Keywords: YOLO, Segment Anything Model (SAM), Limited Annotations, Weed Density Estimation, Precision Agriculture
TL;DR: A lightweight YOLO–SAM hybrid framework that accurately segments and estimates weed density using minimal annotated data, enabling efficient precision farming under limited-data conditions.
Abstract: Weed infestation is a persistent problem in agriculture, particularly in organic farming, where chemical herbicides are restricted and manual weeding is labour-intensive and costly. Accurate and automated weed identification is therefore critical for sustainable crop management, yet conventional detection and segmentation models demand extensive annotated datasets, making large-scale deployment impractical. To address this, the present study introduces a zero-shot instance segmentation framework that integrates the YOLO segmentation model with the Segment Anything Model (SAM). YOLO generates precise bounding boxes for weed regions, which SAM refines to produce high-quality masks for accurate weed-density estimation. This cooperative approach combines YOLO’s strong localisation with SAM’s fine-grained segmentation, achieving robust results even with minimal annotated data. The proposed framework significantly reduces annotation and computational costs while maintaining high accuracy, offering an efficient and scalable solution for precision weed management in agriculture.
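The paper does not include code, but the final density-estimation step it describes — aggregating SAM-refined instance masks into a per-image weed-density figure — can be sketched as follows. This is a minimal illustration, not the authors' implementation: `weed_density` is a hypothetical helper, and it assumes SAM returns one boolean mask per YOLO-detected weed instance. The masks are unioned before counting so that overlapping instances are not double-counted.

```python
import numpy as np

def weed_density(masks, image_shape):
    """Fraction of image area covered by weed masks.

    masks: list of boolean arrays of shape (H, W), one per
           SAM-refined weed instance (hypothetical interface).
    image_shape: (H, W) of the source image.
    """
    # Union of all instance masks: overlapping detections
    # contribute each pixel only once.
    coverage = np.zeros(image_shape, dtype=bool)
    for m in masks:
        coverage |= m.astype(bool)
    return coverage.sum() / coverage.size

# Toy example: two overlapping synthetic "weed" masks on a 10x10 image.
a = np.zeros((10, 10), dtype=bool); a[0:5, 0:5] = True   # 25 px
b = np.zeros((10, 10), dtype=bool); b[3:8, 3:8] = True   # 25 px, 4 px overlap
print(weed_density([a, b], (10, 10)))  # union = 46 px -> 0.46
```

In the full pipeline sketched by the abstract, the `masks` list would come from prompting SAM with each YOLO bounding box; the union-then-count step above is what turns those per-instance masks into a single density estimate.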
Submission Number: 7