On the Intersection Property of Conditional Independence and its Application to Causal Discovery

3 Mar 2014 · Jonas Peters

This work investigates the intersection property of conditional independence. It states that for random variables $A, B, C$ and $X$, if $X$ is independent of $A$ given $(B,C)$ and $X$ is independent of $B$ given $(A,C)$, then $X$ is independent of $(A,B)$ given $C$. Under the assumption that the joint distribution has a continuous density, we provide necessary and sufficient conditions under which the intersection property holds. The result has direct applications to causal inference: it leads to strictly weaker conditions under which the graphical structure becomes identifiable from the joint distribution of an additive noise model.
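For reference, a minimal LaTeX rendering of the intersection property as stated in the abstract; the `\indep` macro for the conditional-independence symbol is an assumed notational convenience, not notation taken from the paper.

```latex
\documentclass{article}
\usepackage{amsmath}
% Conditional-independence symbol (assumed macro, common in the literature)
\newcommand{\indep}{\perp\!\!\!\perp}

\begin{document}
The intersection property: for random variables $X, A, B, C$,
\[
  X \indep A \mid (B, C)
  \quad\text{and}\quad
  X \indep B \mid (A, C)
  \quad\Longrightarrow\quad
  X \indep (A, B) \mid C .
\]
\end{document}
```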
