September 17, 2019, 2:40 PM - 3:20 PM
Busch Campus Student Center
604 Bartholomew Rd
Shannon McCurdy, Ancestry
Ridge regression is frequently used to regularize ill-posed linear least-squares problems. While ridge regression provides shrinkage for the regression coefficients, many of the coefficients remain small but non-zero. We provide a deterministic algorithm using ridge leverage scores that effectively forces coefficients to zero. This algorithm has a provable $(1+\epsilon)$ bound on the statistical risk. As such, it is an interesting alternative to elastic net regularization. Our ridge regression bound follows from the properties of ridge leverage scores, which balance low-rank approximation and regularization. We also give provable guarantees for deterministic column sampling using ridge leverage scores. Like the randomized counterparts, the deterministic algorithm provides $(1+\epsilon)$ error column subset selection, $(1+\epsilon)$ error projection-cost preservation, and an additive-multiplicative spectral bound. We also show that under the assumption of power-law decay of ridge leverage scores, this deterministic algorithm is provably as accurate as randomized algorithms.
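For readers unfamiliar with the quantity driving the talk, here is a minimal sketch of how rank-$k$ ridge leverage scores of the columns of a matrix $A$ can be computed, and how a deterministic threshold rule might select columns. The score definition ($\tau_i = a_i^\top (AA^\top + \lambda I)^{+} a_i$ with $\lambda = \|A - A_k\|_F^2 / k$) follows the standard ridge leverage score literature; the specific threshold `eps / k` below is an illustrative choice, not necessarily the rule analyzed in the talk.

```python
import numpy as np

def ridge_leverage_scores(A, k):
    """Rank-k ridge leverage scores for the columns of A.

    Uses the standard ridge parameter lambda = ||A - A_k||_F^2 / k,
    where A_k is the best rank-k approximation of A.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    lam = np.sum(s[k:] ** 2) / k  # ridge regularization parameter
    # tau_i = a_i^T (A A^T + lam I)^+ a_i.  Via the SVD this reduces to
    # tau_i = sum_j [s_j^2 / (s_j^2 + lam)] * Vt[j, i]^2.
    weights = s**2 / (s**2 + lam)
    tau = np.einsum('j,ji->i', weights, Vt**2)
    return tau

def deterministic_column_select(A, k, eps):
    """Keep every column whose ridge leverage score exceeds a threshold.

    The threshold eps / k is an illustrative choice for this sketch;
    the talk's algorithm may use a different deterministic rule.
    """
    tau = ridge_leverage_scores(A, k)
    keep = np.where(tau > eps / k)[0]
    return keep, tau
```

Each score lies in $[0, 1]$ and measures how much a column matters after regularization: directions with singular values well above $\sqrt{\lambda}$ contribute fully, while directions below it are damped, which is the "balance between low-rank approximation and regularization" the abstract refers to.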