Non-Parametric Inference of Relational Dependence

Published: 20 May 2022, Last Modified: 22 Oct 2023, UAI 2022 Poster
Keywords: Kernel Methods, Causality, Relational Learning
TL;DR: This paper introduces a definition of marginal and conditional independence for relational data and proposes a consistent, non-parametric, scalable kernel test to operationalize it for non-i.i.d. observations under a set of structural assumptions.
Abstract: Independence testing plays a central role in statistical and causal inference from observational data. Standard independence tests assume that the data samples are independent and identically distributed (i.i.d.), but that assumption is violated in many real-world datasets and applications centered on relational systems. This work examines the problem of estimating independence in data drawn from relational systems by defining sufficient representations for the sets of observations influencing individual instances. Specifically, we define marginal and conditional independence tests for relational data by considering the kernel mean embedding as a flexible aggregation function for relational variables. We propose a consistent, non-parametric, scalable kernel test to operationalize the relational independence test for non-i.i.d. observational data under a set of structural assumptions. We empirically evaluate our proposed method on a variety of synthetic and semi-synthetic networks and demonstrate its effectiveness compared to state-of-the-art kernel-based independence tests.
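The abstract names two ingredients: aggregating each instance's variable-sized set of relational observations via a kernel mean embedding, and applying a kernel independence test to the resulting representations. The sketch below is a rough illustration of that idea, not the authors' algorithm: it approximates the kernel mean embedding of each neighbor set with random Fourier features and runs a standard HSIC permutation test for marginal dependence. All function names, the feature count, kernel bandwidths, and the toy relational data are assumptions made for illustration only.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gaussian (RBF) kernel matrix for the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = np.clip(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0, None)
    return np.exp(-gamma * d2)

def set_mean_embeddings(neighbor_sets, dim, gamma=1.0, n_features=100, seed=0):
    """Map each variable-sized neighbor set to a fixed-length vector.
    The per-set average of random Fourier features approximates the
    kernel mean embedding of that set (illustrative choice, not the paper's)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(dim, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    feats = []
    for S in neighbor_sets:
        S = np.asarray(S, dtype=float).reshape(-1, dim)
        phi = np.sqrt(2.0 / n_features) * np.cos(S @ W + b)
        feats.append(phi.mean(axis=0))  # empirical mean embedding of the set
    return np.vstack(feats)

def hsic_permutation_test(X, Y, gamma=1.0, n_perm=500, seed=0):
    """Biased HSIC statistic with a permutation null; a small p-value is
    evidence against marginal independence of the rows of X and Y."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ rbf_kernel(X, gamma) @ H          # centred kernel on X
    L = rbf_kernel(Y, gamma)
    stat = np.sum(Kc * L) / n ** 2
    null = np.empty(n_perm)
    for i in range(n_perm):
        p = rng.permutation(n)
        null[i] = np.sum(Kc * L[np.ix_(p, p)]) / n ** 2
    p_value = (1 + np.sum(null >= stat)) / (1 + n_perm)
    return stat, p_value

if __name__ == "__main__":
    # Toy relational data: each instance observes a variable-sized set of
    # peer attributes, and its outcome depends on the mean of that set.
    rng = np.random.default_rng(1)
    n = 200
    neighbor_sets = [rng.normal(size=rng.integers(2, 8)) for _ in range(n)]
    y = np.array([s.mean() for s in neighbor_sets]) + 0.3 * rng.normal(size=n)

    X_emb = set_mean_embeddings(neighbor_sets, dim=1)
    stat, p = hsic_permutation_test(X_emb, y.reshape(-1, 1))
    print(f"HSIC = {stat:.4g}, permutation p-value = {p:.3f}")
```

The fixed-length embedding is what lets a standard kernel test operate on sets of differing sizes; how the paper handles non-i.i.d. overlap between those sets is beyond this sketch.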
Supplementary Material: zip
Community Implementations: 1 code implementation (https://www.catalyzex.com/paper/arxiv:2207.00163/code)