ProxyTune: Hyperparameter tuning through iteratively refined proxies

Published: 17 Jun 2024, Last Modified: 19 Jul 2024 · 2nd SPIGM Workshop @ ICML (Poster) · CC BY 4.0
Keywords: Hyperparameter Tuning, Causality, Generative Model, Structure learning
TL;DR: We investigate existing hyperparameter tuning methods for causal discovery without access to the ground truth, and propose an iterative refinement procedure that shows better performance.
Abstract: Tuning the hyperparameters of machine learning algorithms against a target metric is essential for ensuring good performance. However, in areas such as causal machine learning, the target metric may be inaccessible due to the lack of ground truth. In this work, we compare two existing approaches and propose an extension, called ProxyTune, which iteratively refines proxies towards the dataset. This makes previously unavailable metrics computable through proxies, enabling existing hyperparameter tuning methods. We focus on causal discovery, where the ground-truth graph is unavailable. Our preliminary results on synthetic data show the ineffectiveness of existing approaches and the advantages of iterative refinement.
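The general idea of proxy-based tuning can be sketched as follows. This is a minimal toy illustration, not the authors' actual ProxyTune algorithm: `discover` stands in for a real structure-learning method, `simulate` is a crude one-pass linear generative proxy, and all function names and parameters are hypothetical. The loop picks the hyperparameter that best recovers the current proxy graph on data simulated from it (where the "ground truth" is known), then refines the proxy by re-running discovery on the real data.

```python
import random

def discover(data, threshold):
    """Toy 'causal discovery' (stand-in for a real algorithm): add an
    edge i->j when the mean product of columns i and j exceeds threshold."""
    n_vars = len(data[0])
    edges = set()
    for i in range(n_vars):
        for j in range(n_vars):
            if i == j:
                continue
            score = sum(row[i] * row[j] for row in data) / len(data)
            if score > threshold:
                edges.add((i, j))
    return edges

def shd(g1, g2):
    """Structural Hamming distance between two edge sets."""
    return len(g1 ^ g2)

def simulate(graph, n_vars, n_samples, rng):
    """Crude generative proxy: noise per variable, plus a single linear
    pass over the edges in index order (cycles are ignored; toy only)."""
    data = []
    for _ in range(n_samples):
        row = [rng.random() for _ in range(n_vars)]
        for (i, j) in sorted(graph):
            row[j] += 0.5 * row[i]
        data.append(row)
    return data

def proxy_tune(data, candidates, n_rounds=3, seed=0):
    """Iterative proxy refinement (hypothetical sketch): since the true
    graph is unknown, score hyperparameters on simulated data where the
    proxy graph plays the role of the ground truth."""
    rng = random.Random(seed)
    n_vars = len(data[0])
    best = candidates[len(candidates) // 2]
    proxy = discover(data, best)  # initial proxy graph from real data
    for _ in range(n_rounds):
        sim = simulate(proxy, n_vars, len(data), rng)
        # hyperparameter that best recovers the proxy graph on simulated data
        best = min(candidates, key=lambda t: shd(discover(sim, t), proxy))
        proxy = discover(data, best)  # refine the proxy on real data
    return best, proxy
```

The key point the sketch captures is that the proxy supplies an otherwise unavailable metric (here, SHD to a known graph), which turns hyperparameter selection back into a standard tuning problem.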
Submission Number: 97