DIMACS/TRIPODS Workshop on Optimization and Machine Learning

Maximizing Submodular Functions Exponentially Faster

August 14, 2018, 9:00 AM - 9:30 AM

Location:

Iacocca Hall

Lehigh University

Bethlehem PA

Yaron Singer, Harvard University

I'll describe a novel approach that yields algorithms whose parallel running time is exponentially faster than that of any previously known algorithm for submodular maximization. These algorithms reduce the parallel running time from Omega(n) to O(log n) while retaining optimal approximation guarantees. Time permitting, I'll discuss parallelization for convex optimization. In contrast to the submodular case, we show information-theoretic lower bounds indicating that parallelization cannot accelerate (non-smooth) convex optimization.
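To give a flavor of how O(log n) parallel rounds can arise, here is a toy sketch in the spirit of adaptive sampling for maximizing a monotone submodular function (coverage) under a cardinality constraint. This is an illustrative simplification, not the algorithm from the talk: the round structure (evaluate all marginal gains independently, filter against a geometrically decaying threshold, add a batch of survivors) is the point, and the function names and parameters are my own.

```python
import random

def coverage(sets, chosen):
    """f(S) = number of elements covered -- a monotone submodular function."""
    covered = set()
    for i in chosen:
        covered |= sets[i]
    return len(covered)

def adaptive_sampling_max(sets, k, eps=0.1, seed=0):
    """Toy sketch of low-adaptivity submodular maximization.

    Each round evaluates all remaining marginal gains at once (these
    queries are independent, so they can run in parallel), discards
    elements whose gain fell below a decaying threshold, and adds a
    random batch of survivors. The threshold decays geometrically, so
    only logarithmically many threshold levels are ever visited; this
    toy version illustrates the round structure, not the approximation
    guarantees from the talk.
    """
    rng = random.Random(seed)
    n = len(sets)
    S = []
    # Start the threshold at the largest singleton value.
    t = max(coverage(sets, [i]) for i in range(n))
    while len(S) < k and t >= 1:
        # One "round": all marginal gains computed independently.
        gains = {i: coverage(sets, S + [i]) - coverage(sets, S)
                 for i in range(n) if i not in S}
        survivors = [i for i, g in gains.items() if g >= t]
        if not survivors:
            t *= (1 - eps)  # lower the threshold and try again
            continue
        batch = rng.sample(survivors, min(len(survivors), k - len(S)))
        S.extend(batch)
    return S
```

For example, with `sets = [{1,2,3},{3,4},{5},{1},{6,7}]` and `k = 2`, the sketch first picks the set of size 3, then filters out the elements it already covers and picks `{6,7}`. Adding a whole batch at gains measured before the batch was added can over-count in general; handling that interaction correctly is part of what the actual analysis does.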

Based on joint work with Eric Balkanski, Adam Breuer, and Aviad Rubinstein.