CoSP Workshop and School on Algorithms and Complexity


How Well can Simple Dynamic Programs be Approximated?

July 16, 2019, 9:30 AM - 11:00 AM

Location:

Hill Center, Room 116

Rutgers University

Mike Saks, Rutgers University

In many of the simplest examples of dynamic programming, inputs of size n are processed by constructing an n-by-n matrix, where each entry is obtained by a simple function of a few entries above and to its left. This yields a simple O(n^2) algorithm for such problems. These algorithms arise naturally, for example, in evaluating various distance measures between two strings, such as LCS (longest common subsequence) distance, Edit Distance, Frechet Distance, and Dynamic Time Warping Distance; the (i, j) entry of the matrix gives the desired measure between the length-i prefix of the first string and the length-j prefix of the second. With few exceptions (such as the Longest Increasing Subsequence (LIS) problem, where the quadratic-time algorithm has been improved to O(n log n)), these quadratic-time dynamic programming algorithms remain essentially the fastest known exact algorithms (up to n^{o(1)}-factor improvements). This phenomenon has been the focus of much recent research in fine-grained complexity, and there is a line of work showing that for many such problems, reducing the running time to O(n^{2-a}) for some a > 0 would contradict the Strong Exponential Time Hypothesis. This suggests that it will be difficult, if not impossible, to find truly subquadratic-time algorithms for these problems.
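To make the matrix structure concrete, here is a minimal Python sketch (not part of the talk) of the standard quadratic-time dynamic program for Edit Distance; the function name and the test strings are illustrative only.

```python
def edit_distance(a: str, b: str) -> int:
    """Classic O(n^2) dynamic program for edit (Levenshtein) distance.

    D[i][j] holds the distance between the length-i prefix of `a` and the
    length-j prefix of `b`; each entry depends only on the entries directly
    above, to the left, and diagonally above-left.
    """
    n, m = len(a), len(b)
    D = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        D[i][0] = i          # delete all i characters of the prefix of a
    for j in range(m + 1):
        D[0][j] = j          # insert all j characters of the prefix of b
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            D[i][j] = min(
                D[i - 1][j] + 1,         # delete a[i-1]
                D[i][j - 1] + 1,         # insert b[j-1]
                D[i - 1][j - 1] + cost,  # match or substitute
            )
    return D[n][m]


if __name__ == "__main__":
    print(edit_distance("kitten", "sitting"))  # prints 3
```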

However, if we are willing to accept a good approximation rather than the exact answer, there has been significant progress in obtaining subquadratic algorithms for both the LIS and Edit Distance problems. This talk will survey some of this work.

 

Video: This is really a "voice over slides" recording. You can hear Mike and see his slides, but you (mostly) can't see Mike.