Bayesian Optimized Meta-Learning for Uncertainty-Driven Optimization

NeurIPS 2024 Workshop BDU Submission 113 Authors

06 Sept 2024 (modified: 10 Oct 2024) · Submitted to NeurIPS BDU Workshop 2024 · CC BY 4.0
Keywords: Bayesian Optimization, Meta Learning, Uncertainty Optimization
TL;DR: Bayesian-optimized meta-learning (BOML) for uncertainty-driven optimization.
Abstract: This paper introduces a Bayesian-optimized meta-learning framework for enhancing model performance in uncertain and noisy industrial environments. By integrating Bayesian Optimization with Model-Agnostic Meta-Learning (MAML), the approach dynamically fine-tunes model parameters for robust performance. Using Gaussian Process surrogates with the Matérn kernel and the Maximum Probability of Improvement (MPI) acquisition function, the framework identifies global minima despite uncertainty. Covariance analysis aligns training and validation losses, while L2 regularization prevents overfitting. Experimental results demonstrate the framework's ability to balance accuracy and generalization, making it suitable for diverse industrial optimization tasks.
Submission Number: 113
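As a rough illustration of the loop described in the abstract, the sketch below runs Bayesian optimization over a single MAML hyperparameter (the inner-loop learning rate) using a Gaussian Process surrogate with a Matérn kernel and a maximum-probability-of-improvement acquisition. This is a minimal sketch, not the authors' implementation: the objective maml_validation_loss is a synthetic placeholder for a real meta-validation loss, the search bounds and candidate pool are arbitrary, and scikit-learn and SciPy are assumed.

# Minimal sketch: Bayesian optimization of a MAML inner learning rate with a
# GP surrogate (Matérn kernel) and a probability-of-improvement acquisition.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def maml_validation_loss(inner_lr: float) -> float:
    """Hypothetical stand-in: replace with the meta-validation loss of a
    MAML model trained with this inner-loop learning rate."""
    return (np.log10(inner_lr) + 2.0) ** 2 + 0.05 * rng.normal()

# Search space for the inner learning rate, on a log10 scale.
bounds = (-4.0, 0.0)
X = rng.uniform(*bounds, size=(4, 1))               # initial design points
y = np.array([maml_validation_loss(10.0 ** x[0]) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True, alpha=1e-3)

def probability_of_improvement(candidates, gp, y_best, xi=0.01):
    """PI acquisition for minimization: P(f(x) < y_best - xi)."""
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    return norm.cdf((y_best - xi - mu) / sigma)

for _ in range(20):
    gp.fit(X, y)
    cand = rng.uniform(*bounds, size=(256, 1))       # random candidate pool
    pi = probability_of_improvement(cand, gp, y.min())
    x_next = cand[np.argmax(pi)]                     # maximize PI (MPI rule)
    y_next = maml_validation_loss(10.0 ** x_next[0])
    X = np.vstack([X, x_next])
    y = np.append(y, y_next)

best = X[np.argmin(y), 0]
print(f"best inner learning rate ~ {10.0 ** best:.4g}, loss {y.min():.4f}")

In practice, the candidate pool would cover all tuned MAML hyperparameters rather than a single learning rate, and each evaluation would correspond to a full meta-training/meta-validation cycle.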
