Beyond Berry-Esseen: Structure and Learning of Sums of Random Variables
Konstantinos Daskalakis (MIT)
- 15:00, 1st December 2014 (week 8, Michaelmas Term 2014), Lecture Theatre B, Wolfson Building, Parks Road
It is well known that ~1/eps^2 coin tosses are necessary and sufficient to estimate the bias of a coin to within eps. But how many samples are needed to learn the distribution of the sum of n independent Bernoullis? I will show that the answer is still O(1/eps^2), independent of n, and will generalize this bound to sums of independent bounded integer random variables (SIIRVs). I will argue that the celebrated Berry-Esseen theorem and its variants fall short of characterizing the general structure of SIIRVs, and offer stronger finitary central limit theorems, which tightly characterize their structure and have the aforementioned implications for learning.
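As a minimal numerical illustration of the first claim in the abstract (roughly 1/eps^2 tosses suffice to estimate a coin's bias to within eps), the sketch below uses the standard Hoeffding bound to pick the number of tosses. It is not the learning algorithm from the talk; the confidence parameter delta and the specific constant in the choice of m are illustrative assumptions.

```python
# Sketch: estimating a coin's bias to within eps from ~1/eps^2 tosses.
# The Hoeffding-based sample size and delta below are assumptions for
# illustration, not parameters from the talk.
import numpy as np

rng = np.random.default_rng(0)

def estimate_bias(p, eps, delta=0.05):
    """Estimate the bias p of a coin to within +/- eps with probability >= 1 - delta."""
    # Hoeffding: P(|p_hat - p| >= eps) <= 2 exp(-2 m eps^2), so this m suffices.
    m = int(np.ceil(np.log(2 / delta) / (2 * eps**2)))
    flips = rng.random(m) < p          # m independent Bernoulli(p) tosses
    return flips.mean(), m

p_true, eps = 0.37, 0.01
p_hat, m = estimate_bias(p_true, eps)
print(f"{m} tosses: estimate {p_hat:.4f}, error {abs(p_hat - p_true):.4f} (target {eps})")
```

The point of contrast in the talk is that the same eps-dependence, with no dependence on n, suffices for learning the full distribution of a sum of n independent Bernoullis, which does not follow from such elementary concentration arguments.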