Sherpa: Hyperparameter Optimization for Machine Learning Models

Lars Hertel, Julian Collado, Peter Sadowski, Pierre Baldi

Sep 30, 2018 NIPS 2018 Workshop MLOSS Submission
  • Abstract: Sherpa is a free open-source hyperparameter optimization library for machine learning models. It is designed for problems with computationally expensive iterative function evaluations, such as the hyperparameter tuning of deep neural networks. With Sherpa, scientists can quickly optimize hyperparameters using a variety of powerful and interchangeable algorithms. Additionally, the framework makes it easy to implement custom algorithms. Sherpa can be run on either a single machine or a cluster via a grid scheduler with minimal configuration. Finally, an interactive dashboard enables users to view the progress of models as they are trained, cancel trials, and explore which hyperparameter combinations are working best. Sherpa empowers machine learning researchers by automating the tedious aspects of model tuning and providing an extensible framework for developing automated hyperparameter-tuning strategies. Its source code and documentation are available at https://github.com/LarsHH/sherpa and https://parameter-sherpa.readthedocs.io/, respectively. A demo can be found at https://youtu.be/L95sasMLgP4.
  • TL;DR: Hyperparameter optimization in Python with an interactive dashboard, supporting Bayesian optimization, Population Based Training, and other algorithms.
  • Keywords: Hyperparameter Optimization, Open source software, Deep Learning
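To make the abstract's workflow concrete, here is a minimal sketch of the suggest/evaluate/observe loop that optimizers like Sherpa automate. This is plain-Python pseudocode for the general pattern (random search over a toy objective), not Sherpa's actual API; the helper names and search space are illustrative, and the real interface should be taken from the documentation linked above.

```python
import random

def suggest(space):
    """Draw one random configuration from the search space.
    (Illustrative helper, not part of Sherpa's API.)"""
    return {name: random.uniform(lo, hi) for name, (lo, hi) in space.items()}

def objective(params):
    """Stand-in for an expensive model evaluation, e.g. a validation
    loss after training a network (lower is better)."""
    return (params["lr"] - 0.01) ** 2 + (params["dropout"] - 0.3) ** 2

# Toy search space: learning rate and dropout, each as a (low, high) range.
space = {"lr": (0.001, 0.1), "dropout": (0.0, 0.5)}

best_params, best_loss = None, float("inf")
for trial in range(50):  # random search: 50 independent trials
    params = suggest(space)
    loss = objective(params)
    if loss < best_loss:
        best_params, best_loss = params, loss

print(best_params, best_loss)
```

In a real setting, `objective` is the computationally expensive iterative evaluation the abstract refers to, and Sherpa's interchangeable algorithms (Bayesian optimization, Population Based Training, etc.) replace the naive random `suggest` step while the dashboard tracks each trial's progress.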