Rutgers Discrete Mathematics Seminar

Title: Improved Matrix Chernoff Bounds for Smooth Distributions

Speaker: Nikhil Srivastava, Princeton

Date: Tuesday, April 10, 2012 2:00pm

Location: Hill Center, Room 124, Rutgers University, Busch Campus, Piscataway, NJ


We study the following basic question: given a vector-valued random variable X in R^n, how many i.i.d. samples X_1, ..., X_q are required so that the empirical covariance matrix (1/q) sum_{i <= q} X_i X_i^T approximates the actual covariance matrix E[XX^T]? A remarkably general result of M. Rudelson shows that when ||X|| <= O(sqrt(n)) almost surely, q = O(n log n / epsilon^2) samples suffice to obtain an epsilon-approximation in the spectral norm. This result and the related Ahlswede-Winter matrix Chernoff bound have had numerous applications in convex geometry, numerical linear algebra, and computer science. Rudelson's bound is known to be tight in this generality, but it can be improved to O(n/epsilon^2) in the case of Gaussian and log-concave X (for instance, X distributed uniformly on a convex body).

We provide an elementary proof that O(n) samples suffice under a more general assumption, namely bounded 2+delta moments of all marginals, which extends the class of distributions for which the O(log n) oversampling is not required. The proof is based on a randomization of a technique that was previously used to construct spectral sparsifiers of graphs of size O(n). Joint work with Roman Vershynin.
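As a numerical illustration of the question the abstract poses (not of the paper's proof technique), the sketch below draws q i.i.d. samples from an isotropic Gaussian in R^n, whose true covariance is the identity, and measures the spectral-norm error of the empirical covariance as q grows. The dimension n, the sample sizes, and the random seed are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20  # ambient dimension (illustrative choice)

def spectral_error(q):
    # Draw q samples X_1, ..., X_q ~ N(0, I_n); the true covariance is I_n.
    X = rng.standard_normal((q, n))
    emp = (X.T @ X) / q  # (1/q) sum_{i <= q} X_i X_i^T
    # Spectral-norm distance between empirical and true covariance.
    return np.linalg.norm(emp - np.eye(n), 2)

# For Gaussian X the error at sample size q behaves like sqrt(n/q),
# i.e. O(n/epsilon^2) samples give an epsilon-approximation,
# with no log n oversampling needed.
for q in [5 * n, 50 * n, 500 * n]:
    print(q, round(spectral_error(q), 3))
```

Running the loop shows the error shrinking roughly like sqrt(n/q), consistent with the O(n/epsilon^2) rate stated for the Gaussian case.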