DIMACS Theoretical Computer Science Seminar


Title: Sparser Johnson-Lindenstrauss Transforms

Speaker: Jelani Nelson, Princeton University and the Institute for Advanced Study

Date: Wednesday, March 21, 2012, 11:00 am - 12:00 pm

Location: DIMACS Center, CoRE Bldg, Room 431, Rutgers University, Busch Campus, Piscataway, NJ


Abstract:

The Johnson-Lindenstrauss (JL) lemma (1984) states that any n points in d-dimensional Euclidean space can be embedded into k = O((log n)/eps^2) dimensions so that all pairwise distances are preserved up to a factor of 1 +/- eps. Furthermore, this embedding can be achieved via a linear mapping. The JL lemma is a useful tool for speeding up solutions to several high-dimensional problems: closest pair, nearest neighbor, diameter, minimum spanning tree, etc. It also speeds up some clustering and string processing algorithms, has connections to RIP_2 matrices from compressed sensing, reduces the storage required for a dataset, and can be used to reduce the memory required for numerical linear algebra problems such as linear regression and low-rank approximation.
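To make the statement concrete, the following is a minimal sketch (not from the talk) of the classical dense construction: a random Gaussian matrix with k = O((log n)/eps^2) rows preserves pairwise distances with high probability. The function name, the constant factor in k, and the test data are illustrative assumptions.

    import numpy as np

    def jl_embed(X, eps, seed=0):
        # Embed the rows of X (n points in R^d) into k = O((log n)/eps^2)
        # dimensions via a dense i.i.d. Gaussian matrix, scaled so that
        # squared lengths are preserved in expectation.
        rng = np.random.default_rng(seed)
        n, d = X.shape
        k = int(np.ceil(8 * np.log(n) / eps ** 2))  # constant 8 is illustrative
        A = rng.normal(size=(k, d)) / np.sqrt(k)
        return X @ A.T

    # Sanity check: a pairwise distance should be preserved up to
    # a factor of roughly 1 +/- eps.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 2000))
    Y = jl_embed(X, eps=0.25)
    orig = np.linalg.norm(X[0] - X[1])
    emb = np.linalg.norm(Y[0] - Y[1])
    print(emb / orig)  # typically lands in [0.75, 1.25]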

The early proofs of the JL lemma specified the linear mapping by a random dense k x d matrix (e.g. one with i.i.d. Gaussian entries), so performing an embedding required dense matrix-vector multiplication. We give a construction of linear mappings for JL in which only an O(eps)-fraction of the entries of the embedding matrix are non-zero, thus speeding up the embedding time. Previous constructions achieved sparse embedding matrices only when 1/eps >> log n.
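As a rough illustration of what such a sparse embedding matrix can look like, here is a sketch in the spirit of the block-style sparse JL constructions: the k rows are split into s blocks, and each column receives one random +/- 1/sqrt(s) entry per block, for s on the order of (log n)/eps. For simplicity this sketch uses full randomness where the actual construction uses hash functions of limited independence; all names and constants below are assumptions, not the paper's exact scheme.

    import numpy as np

    def sparse_jl_matrix(d, k, s, seed=0):
        # Build a k x d embedding matrix with exactly s non-zero entries
        # per column: the k rows are split into s blocks of size k // s,
        # and each column gets one +/- 1/sqrt(s) entry in a uniformly
        # random row of each block.
        assert k % s == 0, "for this sketch, s must divide k"
        rng = np.random.default_rng(seed)
        b = k // s  # block size
        A = np.zeros((k, d))
        offsets = b * np.arange(s)
        for col in range(d):
            rows = offsets + rng.integers(0, b, size=s)
            signs = rng.choice([-1.0, 1.0], size=s)
            A[rows, col] = signs / np.sqrt(s)
        return A

    # With s on the order of (log n)/eps and k on the order of
    # (log n)/eps^2, the fraction of non-zero entries, s/k, is O(eps).
    A = sparse_jl_matrix(d=2000, k=600, s=60)
    print(np.count_nonzero(A) / A.size)  # = s/k = 0.1

Because each column holds only s non-zero entries, applying the map to a vector x costs O(s) work per non-zero coordinate of x rather than O(k), which is the source of the speedup in embedding time.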

This is joint work with Daniel Kane (Stanford).

See: http://paul.rutgers.edu/~yixinxu/theory-spring12.html