DAG-based Generative Regression

22 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Generative regression modeling, DAG-learning, Generative adversarial learning, Causal discovery, Additive noise model
Abstract: Standard regression models capture associations between selected independent variables and targeted dependent variables. This paper generalizes that setting by proposing DAG-based generative regression, a generative process in which the model learns the data generation mechanism from real data. The DAG is explicitly involved in the generative process through structural equation models that capture the data generation mechanisms among the data variables. We learn the DAG by training the model to replicate the real data distribution. Experiments measuring the performance of our algorithm show that it outperforms the state of the art by a significant margin.
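For intuition about the structural-equation / additive-noise-model setup the abstract refers to, the sketch below shows how data can be sampled from a linear additive-noise SEM given a weighted DAG adjacency matrix. This is only a minimal illustration of the general mechanism, assuming a linear SEM with Gaussian noise; the function names (`sample_linear_anm`, `topological_order`) are hypothetical and this is not the authors' learning procedure, which fits the DAG by matching the real data distribution.

```python
import numpy as np

def topological_order(W):
    """Return a topological ordering of the DAG whose edges are the nonzero entries of W."""
    d = W.shape[0]
    remaining = set(range(d))
    order = []
    while remaining:
        # Emit any node whose parents have all been ordered already.
        for j in list(remaining):
            parents = {i for i in range(d) if W[i, j] != 0}
            if parents <= set(order):
                order.append(j)
                remaining.remove(j)
                break
        else:
            raise ValueError("W contains a cycle; it does not define a DAG")
    return order

def sample_linear_anm(W, n_samples, noise_scale=1.0, seed=None):
    """Sample from a linear additive-noise SEM: x_j = sum_i W[i, j] * x_i + eps_j.

    W[i, j] != 0 means variable i is a parent of variable j.
    """
    rng = np.random.default_rng(seed)
    d = W.shape[0]
    X = np.zeros((n_samples, d))
    # Generate variables in topological order so parents exist before children.
    for j in topological_order(W):
        eps = rng.normal(scale=noise_scale, size=n_samples)
        X[:, j] = X @ W[:, j] + eps
    return X

# Example: a 3-variable chain x0 -> x1 -> x2 (hypothetical weights).
W = np.array([[0.0, 2.0,  0.0],
              [0.0, 0.0, -1.5],
              [0.0, 0.0,  0.0]])
X = sample_linear_anm(W, n_samples=1000, seed=0)
```

A generative-regression model of the kind described would learn the SEM (and hence the DAG) so that samples produced this way match the observed data distribution, rather than fixing W in advance as in this sketch.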
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5528