DIMACS Workshop on ADMM and Proximal Splitting Methods in Optimization

June 11 - 13, 2018
DIMACS Center, CoRE Building, Rutgers University

Organizers:
Farid Alizadeh, Rutgers University
Jonathan Eckstein, Rutgers University
Jean-Paul Watson, Sandia National Laboratories
David Woodruff, University of California, Davis

Workshop Announcement

In the past decade, the alternating direction method of multipliers (ADMM) and related algorithms have gained significant popularity for convex optimization problems arising from areas such as machine learning and the analysis of "big data"; image processing; and stochastic optimization. ADMM-based approaches to stochastic programming have recently been applied in forestry and electric power systems, and the widely known progressive hedging algorithm for stochastic programming may, in fact, be viewed as a special case of ADMM.

ADMM belongs to a large family of algorithms that combine proximal (implicit-gradient or augmented-Lagrangian) steps with some form of decomposition procedure, a class we may generically call proximal operator splitting methods. These methods are relatively easy to implement, especially in parallel computing environments. Many new variants have appeared recently, along with a plethora of convergence rate analyses. The traditional analysis of these algorithms depends on problem monotonicity, a property that generalizes standard convexity assumptions for optimization problems. Nevertheless, applications to nonconvex and mixed-integer problems have begun to appear, with varying levels of success. Often these applications are purely heuristic, but in some cases they have been shown to yield useful bounding and relaxation information.
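To make the "proximal steps plus decomposition" recipe concrete, here is a minimal, illustrative sketch of scaled-form ADMM applied to the lasso problem, a standard textbook instance rather than one of the applications named above. The problem data (A, b) and parameters (lam, rho) are hypothetical, chosen only to show the three-step structure: a smooth x-update, a proximal z-update, and a dual update enforcing x = z.

```python
import numpy as np

# Illustrative scaled-form ADMM for the lasso problem
#   minimize (1/2)||A x - b||^2 + lam * ||z||_1   subject to  x = z.
# A, b, lam, rho are hypothetical example data, not from the announcement.

def soft_threshold(v, kappa):
    """Proximal operator of kappa * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def lasso_admm(A, b, lam=1.0, rho=1.0, iters=200):
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable for the constraint x = z
    # Factor once: each x-update solves (A^T A + rho I) x = A^T b + rho (z - u)
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        # x-update: proximal (augmented-Lagrangian) step on the smooth term
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: proximal step on the nonsmooth l1 term
        z = soft_threshold(x + u, lam / rho)
        # dual update: gradient ascent on the scaled multiplier
        u = u + x - z
    return x, z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 10))
    x_true = np.zeros(10)
    x_true[:3] = [2.0, -3.0, 1.5]
    b = A @ x_true
    x_hat, z_hat = lasso_admm(A, b, lam=0.5)
    print(np.round(z_hat, 2))
```

The decomposition is visible in the split: the quadratic data-fitting term and the nonsmooth penalty are handled in separate, simple subproblems, which is what makes these methods attractive for parallel implementation.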

This workshop will bring together theoreticians studying proximal operator splitting algorithms with practitioners using such methods for real-world optimization problems. A particular, but not exclusive, focus will be on problems with nonconvex structures such as integrality constraints. Topics may include theoretical and empirical convergence rate studies, computational experiments on large-scale real problems, asynchronous parallel implementations, and analysis of the validity and accuracy of solutions obtained in nonconvex settings. General goals of the workshop are to make practitioners aware of the latest theoretical developments and algorithm variants, while exposing theoreticians to the most promising, interesting, timely, and challenging applications.


Document last modified on November 15, 2017.