G. Cormode and C. Dickens. Iterative Hessian sketch in input sparsity time. In Proceedings of Beyond First Order Methods in ML (NeurIPS workshop), 2019.

Scalable algorithms that solve optimization and regression tasks, even approximately, are needed to work with large datasets. In this paper we study efficient techniques from matrix sketching to solve a variety of convex constrained regression problems. We adopt "Iterative Hessian Sketching" (IHS) and show that the fast CountSketch and sparse Johnson-Lindenstrauss transforms yield state-of-the-art accuracy guarantees under IHS, while dramatically improving the time cost. As a result, we obtain significantly faster algorithms for constrained regression, for both sparse and dense inputs. Our empirical results show that we can summarize data roughly 100x faster for sparse data, and, surprisingly, 10x faster on dense data! Consequently, solutions accurate to within machine precision of the optimal solution can be found much faster than the previous state of the art.
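The abstract's core idea can be illustrated with a minimal NumPy sketch of the IHS loop using a CountSketch, shown here for unconstrained least squares. This is an illustrative simplification, not the authors' implementation: the paper handles constrained problems and a dense-matrix version of the sketch, and all function names and parameters below are made up for the example.

```python
import numpy as np

def countsketch(A, m, rng):
    """Apply a CountSketch S (m x n) to A: each row of A is hashed to one
    of m buckets and multiplied by a random sign. Runs in time proportional
    to nnz(A), which is the source of the input-sparsity-time speedup."""
    n = A.shape[0]
    buckets = rng.integers(0, m, size=n)
    signs = rng.choice([-1.0, 1.0], size=n)
    SA = np.zeros((m, A.shape[1]))
    np.add.at(SA, buckets, signs[:, None] * A)
    return SA

def ihs_lstsq(A, b, m, iters=30, seed=0):
    """Iterative Hessian Sketch for min ||Ax - b||^2: each iteration
    replaces the exact Hessian A^T A with the sketched (SA)^T (SA),
    while keeping the exact gradient A^T (b - Ax)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        SA = countsketch(A, m, rng)      # fresh sketch each iteration
        H = SA.T @ SA                    # sketched Hessian approximation
        g = A.T @ (b - A @ x)            # exact gradient direction
        x = x + np.linalg.lstsq(H, g, rcond=None)[0]
    return x
```

Because each iteration contracts the error by a factor tied to the sketch's embedding quality, the iterate converges geometrically toward the least-squares solution, which matches the abstract's claim of reaching machine-precision accuracy.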

[ bib | Alternate Version | poster | .pdf ]


This file was generated by bibtex2html 1.92.