ALGOTHEORY SEMINAR

Near-optimal learning of tree-structured distributions

Speaker
Dr Sutanu Gayen, Research Fellow, National University of Singapore https://www.comp.nus.edu.sg/~sutanu/
Chaired by
Dr Arnab Bhattacharyya, Associate Professor, School of Computing
arnab@comp.nus.edu.sg

26 Apr 2021 Monday, 02:00 PM to 03:00 PM

via Zoom

Abstract:
We provide finite sample guarantees for the classical Chow-Liu algorithm (IEEE Trans. Inform. Theory, 1968) to learn a tree-structured graphical model of a distribution. For a distribution P on [k]^n and a tree T on n nodes, we say T is an eps-approximate tree for P if there is a T-structured distribution Q such that D(P||Q) is at most eps larger than the smallest KL divergence from P to any tree-structured distribution. We show that if P itself is tree-structured, then the Chow-Liu algorithm, run with the plug-in estimator for mutual information on O~(k^3 n/eps) i.i.d. samples, outputs an eps-approximate tree for P with constant probability. In contrast, for a general P (which may not be tree-structured), Omega(n^2/eps^2) samples are necessary to find an eps-approximate tree. Our upper bound is based on a new conditional independence tester that addresses an open problem posed by Canonne, Diakonikolas, Kane, and Stewart (STOC, 2018): we prove that for three random variables X, Y, Z, each over [k], testing whether I(X;Y | Z) is zero or at least eps is possible with O~(k^3/eps) samples. Finally, we show that for a fixed tree T, with O~(k^2 n/eps) samples from a distribution P over [k]^n, one can efficiently learn the T-structured distribution closest to P in KL divergence by applying the add-1 estimator at each node.

Joint work with Arnab Bhattacharyya, Eric Price and N.V. Vinodchandran
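As a rough illustration of the procedures named in the abstract, the Python sketch below runs a plain Chow-Liu routine with the plug-in mutual-information estimator, together with the add-1 (Laplace) estimator applied along a parent-child edge of a fixed tree. This is a minimal sketch, not the authors' implementation: the function names (empirical_mutual_information, chow_liu_tree, add_one_estimate) are illustrative, samples are assumed to be an (m, n) integer array with entries in {0, ..., k-1}, and none of the finite-sample guarantees above are reflected in the code.

import numpy as np
from itertools import combinations

def empirical_mutual_information(x, y, k):
    # Plug-in estimate of I(X; Y) from paired samples x, y with values in {0, ..., k-1}.
    joint = np.zeros((k, k))
    np.add.at(joint, (x, y), 1.0)
    joint /= len(x)
    outer = np.outer(joint.sum(axis=1), joint.sum(axis=0))
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / outer[nz])))

def maximum_spanning_tree(weights):
    # Prim's algorithm for a maximum-weight spanning tree over a dense, symmetric weight matrix.
    n = weights.shape[0]
    in_tree, remaining, edges = [0], set(range(1, n)), []
    while remaining:
        u, v = max(((a, b) for a in in_tree for b in remaining), key=lambda e: weights[e])
        edges.append((u, v))
        in_tree.append(v)
        remaining.remove(v)
    return edges

def chow_liu_tree(samples, k):
    # Chow-Liu: a spanning tree maximising the sum of empirical pairwise mutual informations.
    _, n = samples.shape
    weights = np.zeros((n, n))
    for i, j in combinations(range(n), 2):
        mi = empirical_mutual_information(samples[:, i], samples[:, j], k)
        weights[i, j] = weights[j, i] = mi
    return maximum_spanning_tree(weights)

def add_one_estimate(child, parent, k):
    # Add-1 (Laplace) estimate of P(child | parent): one pseudo-count per cell before normalising.
    counts = np.ones((k, k))
    np.add.at(counts, (parent, child), 1.0)
    return counts / counts.sum(axis=1, keepdims=True)

Usage sketch: chow_liu_tree(samples, k) returns n-1 edges as (node already in the tree, newly added node) pairs; one would then root the tree at an arbitrary node and fit each node's conditional given its parent with add_one_estimate.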


Biodata:
Sutanu Gayen is a research fellow at the School of Computing, NUS. He is interested in developing provably correct and efficient algorithms for statistical and causal inference. He is also interested in streaming algorithms and probabilistic algorithms in general.