Learning Sparse Nonparametric DAGs

09 Sept 2020 (modified: 11 Apr 2022) · OpenReview Archive Direct Upload
Abstract: We develop a framework for learning sparse nonparametric directed acyclic graphs (DAGs) from data. Our approach is based on a recent algebraic characterization of DAGs that led to the first fully continuous optimization for score-based learning of DAG models parametrized by a linear structural equation model (SEM). We extend this algebraic characterization to nonparametric SEMs by leveraging nonparametric sparsity based on partial derivatives, resulting in a continuous optimization problem that can be applied to a variety of nonparametric and semiparametric models, including GLMs, additive noise models, and index models as special cases. Unlike existing approaches that require specific modeling choices, loss functions, or algorithms, we present a completely general framework that can be applied to general nonlinear models (e.g., without additive noise), general differentiable loss functions, and generic black-box optimization routines.
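The "algebraic characterization" the abstract builds on is, in the earlier linear-SEM work, the smooth constraint h(W) = tr(e^{W∘W}) − d = 0, which holds exactly when the weighted adjacency matrix W encodes a DAG. Below is a minimal sketch of that characterization, assuming that trace-of-matrix-exponential form; the function name `acyclicity` and the example matrices are illustrative, not from the paper. In the nonparametric extension described here, the entries of W would instead be built from norms of partial derivatives of the fitted functions (that substitution is not shown).

```python
# Minimal sketch of the continuous acyclicity characterization (assumed
# h(W) = tr(exp(W ∘ W)) - d form from the linear-SEM predecessor work).
import numpy as np
from scipy.linalg import expm

def acyclicity(W: np.ndarray) -> float:
    """h(W) = tr(e^{W∘W}) - d; zero iff W is acyclic, and smooth in W."""
    d = W.shape[0]
    return float(np.trace(expm(W * W)) - d)  # W * W is the elementwise square

# Hypothetical example: chain 0 -> 1 -> 2 (a DAG) versus the same chain
# with a back-edge 2 -> 0 that closes a cycle.
W_dag = np.array([[0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0],
                  [0.0, 0.0, 0.0]])
W_cyc = W_dag.copy()
W_cyc[2, 0] = 1.0  # back-edge closing the cycle

print(acyclicity(W_dag))  # 0.0: strictly triangular, no cycles
print(acyclicity(W_cyc))  # > 0: the cycle contributes to the trace
```

Because h is differentiable, the hard combinatorial DAG constraint can be handled by generic continuous optimizers (e.g., augmented Lagrangian methods), which is what makes the "black-box optimization routines" claim in the abstract possible.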