Learning And Testing Causal Models: A Property Testing Viewpoint

APA

Bhattacharyya, A. (2022). Learning And Testing Causal Models: A Property Testing Viewpoint. The Simons Institute for the Theory of Computing. https://simons.berkeley.edu/talks/learning-and-testing-causal-models-property-testing-viewpoint

MLA

Bhattacharyya, Arnab. Learning And Testing Causal Models: A Property Testing Viewpoint. The Simons Institute for the Theory of Computing, 15 Feb. 2022, https://simons.berkeley.edu/talks/learning-and-testing-causal-models-property-testing-viewpoint.

BibTeX

          @misc{ scivideos_19666,
            doi = {},
            url = {https://simons.berkeley.edu/talks/learning-and-testing-causal-models-property-testing-viewpoint},
            author = {Arnab Bhattacharyya},
            keywords = {},
            language = {en},
            title = {Learning And Testing Causal Models: A Property Testing Viewpoint},
            publisher = {The Simons Institute for the Theory of Computing},
            year = {2022},
            month = {feb},
            note = {see \url{https://scivideos.org/index.php/Simons-Institute/19666}}
          }
          
Arnab Bhattacharyya (National University of Singapore, Singapore)
Source Repository: Simons Institute

Abstract

We consider testing and learning problems on causal Bayesian networks in which the variables take values from a bounded domain of size k. We address two problems: (i) given access to observations and experiments on two unknown environments X and Y, test whether X = Y or X is far from Y, where two environments are considered equal if no intervention can distinguish between them; and (ii) given access to observations and experiments on an unknown environment X, learn a DAG that admits a causal model M such that X is close to M.

For problem (i), we show that under natural sparsity assumptions on the underlying DAG, only O(log n) interventions and Õ(n) samples per intervention are sufficient. This is joint work with Jayadev Acharya, Constantinos Daskalakis, and Saravanan Kandasamy.

For problem (ii), we consider the setting with two variables, where the goal is to learn whether X causes Y, Y causes X, or a hidden variable confounds the two. Under natural assumptions, we obtain a nearly tight characterization of the sample complexity, which is sublinear in the domain size k. Moreover, there is a tradeoff between the number of observational samples and interventional samples. This is joint work with Jayadev Acharya, Sourbh Bhadane, Saravanan Kandasamy, and Ziteng Sun.
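To give a feel for why interventions help in the two-variable setting of problem (ii), here is a minimal illustrative sketch (not the algorithm from the talk): in a ground-truth model where X causes Y, intervening on X shifts the distribution of Y, while intervening on Y leaves the distribution of X unchanged. The specific model (binary variables, a 0.1 noise rate) and function names are hypothetical choices for illustration only.

```python
# Hypothetical two-variable example: ground truth is X -> Y, where Y copies X
# and flips it with probability 0.1. We compare marginals under do()-interventions.
import random

random.seed(0)

def sample_x_causes_y(do_x=None, do_y=None):
    """One sample from the toy model X -> Y, optionally under an intervention."""
    x = do_x if do_x is not None else random.randint(0, 1)
    y = do_y if do_y is not None else (x if random.random() < 0.9 else 1 - x)
    return x, y

def mean(samples):
    return sum(samples) / len(samples)

n = 20000
# The distribution of X does not respond to interventions on Y ...
x_under_do_y0 = mean([sample_x_causes_y(do_y=0)[0] for _ in range(n)])
x_under_do_y1 = mean([sample_x_causes_y(do_y=1)[0] for _ in range(n)])
# ... while the distribution of Y responds strongly to interventions on X.
y_under_do_x0 = mean([sample_x_causes_y(do_x=0)[1] for _ in range(n)])
y_under_do_x1 = mean([sample_x_causes_y(do_x=1)[1] for _ in range(n)])

print(abs(x_under_do_y0 - x_under_do_y1) < 0.05)  # X insensitive to do(Y)
print(abs(y_under_do_x0 - y_under_do_x1) > 0.5)   # Y tracks do(X)
```

Under the mirrored model Y -> X the asymmetry reverses, and under a hidden confounder neither variable responds to interventions on the other; distinguishing these cases with few samples, sublinearly in the domain size, is the content of the sample-complexity results described above.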