Title: Recovery Guarantee of Weighted Low-Rank Approximation via Alternating Minimization
Speaker: Yingyu Liang, Princeton University
Date: Wednesday, April 6, 2016 11:00am-12:00pm
Location: CoRE Bldg, Room 301, Rutgers University, Busch Campus, Piscataway, NJ
Abstract:
Many applications require recovering a ground-truth low-rank matrix from noisy observations of its entries. In practice, this is typically formulated as a weighted low-rank approximation problem and solved using non-convex optimization heuristics such as alternating minimization. Such non-convex techniques come with few theoretical guarantees. Worse, weighted low-rank approximation is NP-hard even in the simplest case, when the ground truth is a rank-1 matrix.
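For concreteness, the problem can be stated as follows (the notation is standard but mine, not taken from the talk): given a noisy observation \widetilde{M} = M^* + N of a rank-k ground truth M^* and a non-negative weight matrix W, solve

    \min_{\operatorname{rank}(X) \le k} \; \bigl\| W \circ (\widetilde{M} - X) \bigr\|_F^2,

where \circ denotes the entrywise product. Matrix completion is the special case in which W is binary.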
Under moderate assumptions on the weight matrix and the ground truth, we prove that the vanilla alternating minimization algorithm, augmented with a simple and cheap "clipping" step that zeroes out rows with large ℓ2 norms in the intermediate solutions, recovers the ground truth. In particular, we bound the spectral norm of the difference between the recovered matrix and the ground truth by the spectral norm of the weighted noise plus an additive error term that decreases exponentially with the number of rounds of alternating minimization. This is the first recovery guarantee for weighted low-rank approximation via alternating minimization with non-binary deterministic weights. It significantly generalizes the results for matrix completion, the special case with binary weights, since our assumptions are similar to or weaker than those made in existing work. Moreover, our proof applies to random initialization, unlike prior approaches, which typically require an SVD-based initialization.
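To make the procedure concrete, here is a minimal Python sketch of alternating minimization with a clipping step. It is illustrative only: the clipping threshold (a multiple of the root-mean-square row norm), the round count, and the update order are assumptions of mine, not the exact procedure analyzed in the talk, and the names weighted_als_with_clipping and clip_rows are hypothetical.

```python
import numpy as np


def clip_rows(F, clip_mult):
    """Zero out rows of F whose l2 norm exceeds clip_mult times the
    root-mean-square row norm (an assumed, illustrative threshold)."""
    norms = np.linalg.norm(F, axis=1)
    thresh = clip_mult * np.sqrt(np.mean(norms ** 2))
    out = F.copy()
    out[norms > thresh] = 0.0
    return out


def weighted_als_with_clipping(M_obs, W, k, rounds=50, clip_mult=2.0, seed=0):
    """Alternating minimization for min_{rank-k X} ||W o (M_obs - X)||_F^2,
    keeping X = U @ V.T and clipping large rows of the intermediate factors.
    Sketch only; thresholds and stopping rule are illustrative."""
    n, m = M_obs.shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((n, k))  # random initialization (covered by the result)

    for _ in range(rounds):
        # Fix U; each row V[j] solves the weighted least-squares problem
        #   min_v  sum_i W[i, j] * (M_obs[i, j] - U[i] @ v)**2
        V = np.zeros((m, k))
        for j in range(m):
            G = U.T * W[:, j]  # k x n: rows of U scaled by the column-j weights
            V[j] = np.linalg.lstsq(G @ U, G @ M_obs[:, j], rcond=None)[0]
        V = clip_rows(V, clip_mult)

        # Fix V; symmetric update for the rows of U.
        U = np.zeros((n, k))
        for i in range(n):
            G = V.T * W[i, :]  # k x m: rows of V scaled by the row-i weights
            U[i] = np.linalg.lstsq(G @ V, G @ M_obs[i, :], rcond=None)[0]
        U = clip_rows(U, clip_mult)

    return U, V
```

Each inner update is a per-row weighted least-squares solve, and the clipping step enforces the row-norm control on the intermediate solutions described above; the recovered matrix is U @ V.T.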
See: http://www.cs.rutgers.edu/~allender/theory-spring16.html