Abstract: In 3D printing material defect detection, environmental variations frequently induce false positives and missed detections in automated systems. Current research addresses dynamic environments by fine-tuning models during the detection phase. However, existing methods suffer from delayed adaptation: models require prolonged iterations to achieve stable performance in new environments. Furthermore, noise-contaminated pseudo-labels during domain adaptation exacerbate error accumulation and catastrophic forgetting, leading to severe performance degradation. To address these challenges, we propose a robust and rapidly adaptive detection framework tailored for dynamic environments. First, we employ the Gram matrix of model feature layers to quantify environmental shifts, endowing the model with real-time environmental awareness. Second, a dynamically maintained sample buffer ensures that stored samples satisfy three critical properties: pseudo-label reliability, class-balanced distribution, and diverse environmental representation. This mechanism selects the samples most representative of new environments for fine-tuning, significantly accelerating adaptation. Experimental results demonstrate that our method achieves superior detection accuracy on real-world 3D printing material datasets under complex scenarios (e.g., sudden illumination changes and environmental shifts). Compared to baseline Test-Time Adaptation (TTA) methods, it exhibits enhanced adaptability and robustness.
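As an illustrative aside (not the authors' released code), the environmental-awareness step described in the abstract can be sketched as follows: compute a channel-wise Gram matrix over an intermediate feature map and compare it against a running reference to score how far the current input's statistics have drifted. The class and function names (`FeatureGramMonitor`, `shift_score`, the momentum value) are hypothetical, and the Frobenius-distance scoring is one plausible choice under these assumptions rather than the paper's exact formulation.

```python
import torch


def gram_matrix(feat: torch.Tensor) -> torch.Tensor:
    """Channel-wise Gram matrix of a feature map of shape (B, C, H, W)."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    # Normalize by the number of spatial positions so the statistic is
    # comparable across input resolutions.
    return torch.bmm(f, f.transpose(1, 2)) / (h * w)


class FeatureGramMonitor:
    """Hypothetical monitor: tracks a running reference Gram matrix and scores
    how far the current batch's feature statistics drift from it
    (a larger score suggests a larger environmental shift)."""

    def __init__(self, momentum: float = 0.99):
        self.momentum = momentum
        self.reference = None  # running mean Gram matrix, shape (C, C)

    @torch.no_grad()
    def shift_score(self, feat: torch.Tensor) -> float:
        gram = gram_matrix(feat).mean(dim=0)  # average over the batch
        if self.reference is None:
            self.reference = gram.clone()
            return 0.0
        # Frobenius distance between current and reference statistics,
        # used here as a simple proxy for environmental shift.
        score = torch.linalg.norm(gram - self.reference).item()
        # Slowly update the reference so it tracks gradual changes.
        self.reference.mul_(self.momentum).add_(gram, alpha=1 - self.momentum)
        return score


if __name__ == "__main__":
    # Usage sketch: feed intermediate features from the detector's backbone.
    monitor = FeatureGramMonitor()
    backbone_feat = torch.randn(4, 256, 32, 32)   # placeholder feature map
    print(monitor.shift_score(backbone_feat))      # ~0.0 on the first batch
    shifted_feat = backbone_feat * 1.5 + 0.3       # simulate an illumination change
    print(monitor.shift_score(shifted_feat))       # larger score signals a shift
```

In practice, a score exceeding a calibrated threshold could trigger the fine-tuning step on samples drawn from the buffer, which is how such a shift signal would typically be consumed; the threshold and triggering policy are assumptions here.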
External IDs: dblp:conf/case/LiSFGDWXW25