Theoretical Computer Science Seminar

The Complexity of Dynamic Least-Squares Regression

March 06, 2024, 11:00 AM - 12:00 PM

Location:

Conference Room 301

Rutgers University

CoRE Building

96 Frelinghuysen Road

Piscataway, NJ 08854

Speaker:

Shunhua Jiang, Columbia University

We settle the complexity of dynamic least-squares regression (LSR), where rows and labels (A^(t), b^(t)) can be adaptively inserted and/or deleted, and the goal is to efficiently maintain an ε-approximate solution to min_{x^(t)} ||A^(t) x^(t) − b^(t)||_2 for all t ∈ [T]. We prove sharp separations (d^{2−o(1)} vs. ~d) between the amortized update time of: (i) fully vs. partially dynamic 0.01-LSR; (ii) high- vs. low-accuracy LSR in the partially dynamic (insertion-only) setting.
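For context, the following is a minimal sketch (with hypothetical names; this is not the algorithm from the talk) of the naive baseline for dynamic LSR: maintaining the normal equations supports each row insertion or deletion as a rank-one update in O(d^2) time. The separations above are measured against this kind of baseline: ~d amortized update time is achievable for partially dynamic 0.01-LSR, while fully dynamic 0.01-LSR and high-accuracy insertion-only LSR require d^{2−o(1)}.

```python
import numpy as np

class NaiveDynamicLSR:
    """Hypothetical naive baseline for dynamic least-squares regression.

    Maintains the normal equations A^T A and A^T b, so each row insertion
    or deletion is a rank-one update costing O(d^2), and each solve returns
    an exact minimizer of min_x ||Ax - b||_2.
    """

    def __init__(self, d: int):
        self.gram = np.zeros((d, d))  # running A^T A
        self.atb = np.zeros(d)        # running A^T b

    def insert(self, a: np.ndarray, b: float) -> None:
        # Rank-one update: O(d^2) per inserted row (a, b).
        self.gram += np.outer(a, a)
        self.atb += b * a

    def delete(self, a: np.ndarray, b: float) -> None:
        # Rank-one downdate; assumes (a, b) was previously inserted.
        self.gram -= np.outer(a, a)
        self.atb -= b * a

    def solve(self) -> np.ndarray:
        # Min-norm least-squares solution via the (possibly singular)
        # normal equations: x = (A^T A)^+ A^T b.
        return np.linalg.lstsq(self.gram, self.atb, rcond=None)[0]
```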

Our lower bounds follow from a gap-amplification reduction, reminiscent of iterative refinement, from the exact version of the Online Matrix Vector Conjecture (OMv) to constant-approximate OMv over the reals, where the i-th online product Hv^(i) only needs to be computed to 0.1-relative error. All previous fine-grained reductions from OMv to its approximate versions only show hardness for inverse-polynomial approximation ε = n^{−ω(1)} (additive or multiplicative). This result is of independent interest in fine-grained complexity and for the investigation of the OMv Conjecture, which is still wide open.
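To make the reduction's target concrete, here is an illustrative sketch of the approximate-OMv interaction described above: a matrix H is fixed in advance, vectors arrive online, and each answer must achieve 0.1-relative error on Hv^(i) before the next vector is revealed. The harness name omv_stream and the checking logic are assumptions for illustration, not code from the paper.

```python
import numpy as np

def omv_stream(H, vectors, answer, eps=0.1):
    """Hypothetical harness for approximate online matrix-vector (OMv).

    H may be preprocessed offline; each vector v^(i) must be answered
    before v^(i+1) is revealed, and the answer must be within eps-relative
    error of H @ v^(i) in the Euclidean norm.
    """
    for v in vectors:
        u = answer(v)  # the online algorithm's (approximate) product
        exact = H @ v
        assert np.linalg.norm(u - exact) <= eps * np.linalg.norm(exact) + 1e-12
        yield u

# The trivial algorithm answers each query exactly in O(n^2) time, n^3 in
# total; the OMv Conjecture asserts that no algorithm achieves O(n^{3-delta})
# total time, and the reduction shows 0.1-relative error is just as hard.
n = 8
H = np.random.randint(0, 2, size=(n, n)).astype(float)
queries = [np.random.randint(0, 2, size=n).astype(float) for _ in range(n)]
outputs = list(omv_stream(H, queries, answer=lambda v: H @ v))
```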