Feature Importance Random Search for Hyperparameter Optimization of Data-Consistent Model Inversion

Published: 03 Nov 2023, Last Modified: 04 Nov 2023
NeurIPS 2023 Deep Inverse Workshop Poster
Keywords: Hyperparameter Optimization, HPO, Random Search, Feature Importance, Data-Consistent Model Inversion, DCMI, Mechanistic Model
TL;DR: A novel end-to-end HPO method for data-consistent inverse modeling that applies an iterative, feature-importance-based gradient update to guide random search.
Abstract: We consider hyperparameter optimization (HPO) of approaches that employ outputs of mechanistic models as priors in hybrid modeling for data-consistent inversion. An implicit density estimator (DE) models a non-parametric distribution of model input parameters, and the push-forward of those generated samples produces a model output distribution that should match a target distribution of observed data. A rejection sampler then filters out “undesirable” samples through a discriminator function. In a sample-generate-reject pipeline whose objective is to fit the push-forward to the observed experimental outputs, several DEs can be employed within the generator and discriminator components. However, extensive evaluation of these end-to-end inversion frameworks is still lacking. In particular, the data-consistent model inversion pipeline poses an extra challenge: the constituent models must be optimized jointly. Traditional HPO methods are often limited to single-model scenarios and might not map directly to frameworks that optimize several models against a single loss. To overcome the time overhead of optimizing each component separately, and the expanded combinatorial search space, we introduce a method that performs an initial random search to bootstrap an HPO that applies weighted feature importance to gradually update the hyperparameter set, periodically probing the pipeline to track the loss. Our experiments show a reduced number of time-intensive pipeline runs with faster convergence.
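The sample-generate-reject pipeline described in the abstract follows the standard data-consistent inversion pattern: generated parameter samples are pushed through the mechanistic model, and the discriminator accepts each sample with probability proportional to the ratio of the observed-data density to the push-forward density at that sample's output. Below is a minimal sketch of that rejection step, assuming Gaussian KDEs stand in for the density estimators; the toy forward model, sample sizes, and distributions are illustrative placeholders, not the paper's actual components:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

def forward_model(lam):
    """Toy stand-in for the mechanistic model Q(lambda)."""
    return np.sin(lam) + 0.1 * lam

# Stand-in for samples drawn from the implicit DE over input parameters.
lam = rng.normal(0.0, 1.5, size=5000)
q = forward_model(lam)                        # push-forward of the generated samples

# Stand-in for the observed experimental outputs (the target distribution).
obs = rng.normal(0.5, 0.2, size=500)

push_de = gaussian_kde(q)                     # DE of the push-forward distribution
obs_de = gaussian_kde(obs)                    # DE of the observed distribution

# Discriminator: density ratio at each sample's model output.
r = obs_de(q) / np.maximum(push_de(q), 1e-12)

# Rejection step: accept sample i with probability r_i / max(r).
accept = rng.uniform(0.0, r.max(), size=r.size) < r
lam_post = lam[accept]                        # data-consistent parameter samples
print(f"accepted {accept.mean():.1%} of {lam.size} samples")
```

The HPO method itself is only described at a high level, so the following is one plausible instantiation rather than the authors' algorithm: a random-search bootstrap scores a handful of configurations end-to-end, a random-forest surrogate fit to the resulting (configuration, loss) pairs supplies feature importances over the hyperparameters, and subsequent samples are drawn from a box around the incumbent whose per-dimension width is importance-weighted, with the full pipeline probed periodically. All names here (`HP_SPACE`, `run_pipeline`, the hyperparameters themselves) are hypothetical:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical search space: each hyperparameter sampled uniformly in [lo, hi]
# (integer-valued dimensions are treated as continuous for simplicity).
HP_SPACE = {
    "gen_lr":        (1e-4, 1e-1),   # generator (density estimator) step size
    "disc_lr":       (1e-4, 1e-1),   # discriminator step size
    "gen_width":     (16, 256),      # generator hidden width
    "reject_thresh": (0.1, 0.9),     # rejection-sampler acceptance threshold
}
NAMES = list(HP_SPACE)

def sample_config(center=None, scale=1.0, weights=None):
    """Draw one config; optionally shrink the box around `center`,
    keeping important dimensions wide and freezing unimportant ones."""
    cfg = {}
    for i, name in enumerate(NAMES):
        lo, hi = HP_SPACE[name]
        if center is None:
            cfg[name] = rng.uniform(lo, hi)
        else:
            w = 1.0 if weights is None else weights[i]
            half = 0.5 * (hi - lo) * scale * w
            c = center[name]
            cfg[name] = float(np.clip(rng.uniform(c - half, c + half), lo, hi))
    return cfg

def run_pipeline(cfg):
    """Placeholder for one expensive end-to-end inversion run, returning the
    loss between the push-forward and the target distribution."""
    x = np.array([cfg[n] for n in NAMES])
    return float(np.sum((x - 0.3) ** 2))      # synthetic loss, for illustration only

# --- Phase 1: random-search bootstrap --------------------------------------
X, y = [], []
for _ in range(20):
    cfg = sample_config()
    X.append([cfg[n] for n in NAMES])
    y.append(run_pipeline(cfg))

# --- Phase 2: feature-importance-guided refinement -------------------------
for it in range(10):
    surrogate = RandomForestRegressor(n_estimators=100, random_state=0)
    surrogate.fit(np.asarray(X), np.asarray(y))
    imp = surrogate.feature_importances_       # sums to 1 over hyperparameters
    weights = imp / imp.max()                  # widest box for the most important dim
    best = dict(zip(NAMES, X[int(np.argmin(y))]))
    # Periodically probe the full pipeline around the incumbent configuration.
    cfg = sample_config(center=best, scale=0.5 ** (it // 3), weights=weights)
    X.append([cfg[n] for n in NAMES])
    y.append(run_pipeline(cfg))

print("best loss:", min(y), "at", dict(zip(NAMES, X[int(np.argmin(y))])))
```

The design intent mirrors the abstract: expensive end-to-end runs are spent only where the surrogate's importances suggest the loss is most sensitive, which is how such a scheme would reduce the number of pipeline evaluations while converging faster.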
Submission Number: 32