Efficient Hyperparameter Optimization Through Tensor Completion

Published: 01 Feb 2023, Last Modified: 13 Feb 2023. Submitted to ICLR 2023.
Keywords: hyperparameter optimization, tensor completion
TL;DR: An approach for hyperparameter optimization based on tensor completion methods.
Abstract: Hyperparameter optimization is a prerequisite for state-of-the-art performance in machine learning, with current strategies including Bayesian optimization, Hyperband, and evolutionary methods. While such methods have been shown to improve performance, none of them is designed to explicitly exploit the underlying data structure. To this end, we introduce a completely different approach for hyperparameter optimization, based on low-rank tensor completion. This is achieved by first forming a multi-dimensional tensor comprising performance scores for different combinations of hyperparameters. Based on the realistic underlying assumption that the so-formed tensor has a low-rank structure, reliable estimates of the unobserved validation scores of hyperparameter combinations can then be obtained through tensor completion, from only a fraction of known elements. Through extensive experimentation on various datasets and learning models, the proposed method is shown to exhibit competitive or superior performance compared to state-of-the-art hyperparameter optimization strategies. Distinctive advantages of the proposed method include its ability to simultaneously handle any hyperparameter type (e.g., kind of optimizer, number of neurons, number of layers), its relative simplicity compared to competing methods, and its ability to suggest multiple optimal combinations of hyperparameters.
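The idea sketched in the abstract can be illustrated with a minimal, hypothetical example: arrange validation scores over a hyperparameter grid into a 3-way tensor, observe only a fraction of its entries, and recover the rest under a low CP-rank assumption. The sketch below is not the authors' implementation; it uses plain gradient descent on randomly initialized CP factors and synthetic rank-2 data, purely for illustration.

```python
import numpy as np

def cp_complete(T_obs, mask, rank, lr=1.0, n_iter=3000, seed=0):
    """Complete a partially observed 3-way tensor under a low CP-rank assumption.

    T_obs : observed entries (arbitrary values where mask == 0),
    mask  : 1.0 where an entry was observed, 0.0 otherwise.
    """
    rng = np.random.default_rng(seed)
    I, J, K = T_obs.shape
    n_obs = mask.sum()
    # Small random init of the CP factor matrices A (I x R), B (J x R), C (K x R).
    A = 0.1 * rng.standard_normal((I, rank))
    B = 0.1 * rng.standard_normal((J, rank))
    C = 0.1 * rng.standard_normal((K, rank))
    for _ in range(n_iter):
        T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
        R = mask * (T_hat - T_obs) / n_obs  # residual on observed entries only
        # Gradients of the masked squared error w.r.t. each factor matrix.
        gA = np.einsum('ijk,jr,kr->ir', R, B, C)
        gB = np.einsum('ijk,ir,kr->jr', R, A, C)
        gC = np.einsum('ijk,ir,jr->kr', R, A, B)
        A -= lr * gA
        B -= lr * gB
        C -= lr * gC
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Hypothetical score tensor over a (learning rate x depth x width) grid,
# generated to have CP rank 2 so that the low-rank assumption holds exactly.
rng = np.random.default_rng(1)
A0 = rng.standard_normal((6, 2))
B0 = rng.standard_normal((5, 2))
C0 = rng.standard_normal((4, 2))
T_true = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
mask = (rng.random(T_true.shape) < 0.6).astype(float)  # evaluate ~60% of configs
T_est = cp_complete(T_true * mask, mask, rank=2)
```

In this setting the completed tensor `T_est` approximates the unobserved scores, so the most promising hyperparameter combinations can be read off as its largest entries without training a model for every grid point.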
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: General Machine Learning (ie none of the above)
Supplementary Material: zip
