Sequential Automated Machine Learning: Bandits-driven Exploration using a Collaborative Filtering Representation

Published: 14 Jul 2021, Last Modified: 05 May 2023. AutoML@ICML 2021 Poster.
Keywords: automl, collaborative filtering, bandits, sequential automl
Abstract: The goal of Automated Machine Learning (AutoML) is to make Machine Learning (ML) tools more accessible. Collaborative Filtering (CF) methods have shown great success in automating the creation of machine learning pipelines. In this work, we frame the AutoML problem in a sequential setting where datasets arrive one at a time. On each dataset, an agent can try a small number of pipelines (exploration) before recommending a pipeline for that dataset (recommendation). The goal is to maximize the performance of the recommended pipelines over the sequence of datasets. More specifically, we focus on the exploration policy used to select the pipelines to explore before making the recommendation. We propose an approach based on the LinUCB bandit algorithm that leverages the latent representations extracted by matrix factorization (MF). We show that the exploration policy impacts the recommendation performance and that MF-based latent representations are more useful for exploration than for recommendation.
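
As a rough illustration of the exploration policy described above, the sketch below runs LinUCB on a new dataset using matrix-factorization latent vectors as the arm features of the candidate pipelines. It is a minimal sketch under stated assumptions, not the authors' implementation: the names (pipeline_latents, evaluate, budget, alpha) are hypothetical, and the final recommendation rule (return the best pipeline observed during exploration) is one simple choice among several.

import numpy as np

def linucb_explore(pipeline_latents, evaluate, budget, alpha=1.0):
    """Explore `budget` pipelines on a new dataset with LinUCB, using MF
    latent vectors as arm features (illustrative sketch only).

    pipeline_latents : (n_pipelines, d) array of latent vectors from MF.
    evaluate         : callable(pipeline_index) -> observed score on the
                       current dataset (hypothetical evaluation hook).
    budget           : number of exploration steps allowed per dataset.
    alpha            : strength of the upper-confidence-bound term.
    """
    n, d = pipeline_latents.shape
    A = np.eye(d)          # ridge-regularized Gram matrix of tried features
    b = np.zeros(d)        # accumulated reward-weighted features
    tried, scores = [], []

    for _ in range(budget):
        A_inv = np.linalg.inv(A)
        theta = A_inv @ b  # current linear reward model in latent space
        # UCB score for every pipeline's latent vector
        ucb = pipeline_latents @ theta + alpha * np.sqrt(
            np.einsum("ij,jk,ik->i", pipeline_latents, A_inv, pipeline_latents)
        )
        if tried:
            ucb[tried] = -np.inf              # do not re-run a pipeline
        a = int(np.argmax(ucb))
        r = evaluate(a)                       # run the pipeline, observe its score
        x = pipeline_latents[a]
        A += np.outer(x, x)                   # standard LinUCB updates
        b += r * x
        tried.append(a)
        scores.append(r)

    # Recommend the best pipeline seen during exploration (simple rule,
    # assumed here for illustration).
    return tried[int(np.argmax(scores))], tried, scores

In this sketch, each new dataset starts a fresh LinUCB instance, while the latent vectors themselves carry the knowledge transferred across datasets via matrix factorization.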
Ethics Statement: The goal of the proposed work is to support data scientists in their modeling tasks, not to replace them. Pipeline recommendations should be reviewed by a data scientist before being deployed. The knowledge used to support the recommendations is updated after each pipeline optimization request. The framework includes a total of 175 pipelines, some of which are less efficient than others. After each request, the agent tends to consolidate its knowledge on a subset of well-performing pipelines, discriminating against the poorly performing ones. This bias is acceptable because the end goal of this work is to recommend efficient pipelines to data scientists; in such a setting, the quality of the recommendations prevails over their diversity.