DIMACS Workshop on ADMM and Proximal Splitting Methods in Optimization
June 11, 2018 - June 13, 2018
Location:
DIMACS Center
Rutgers University
CoRE Building
96 Frelinghuysen Road
Piscataway, NJ 08854
Organizer(s):
Farid Alizadeh, Rutgers University
Jonathan Eckstein, Rutgers University
Jean-Paul Watson, Sandia National Laboratories
David L. Woodruff, University of California, Davis
In the past decade, the alternating direction method of multipliers (ADMM) and related algorithms have gained significant popularity for convex optimization problems arising in areas such as machine learning and the analysis of “big data”; image processing; and stochastic optimization. ADMM-based approaches to stochastic programming have recently been applied in forestry and electric power systems, and the widely known progressive hedging algorithm for stochastic programming may, in fact, be viewed as a special case of ADMM.
The ADMM is part of a large family of algorithms that use proximal (implicit-gradient or augmented Lagrangian) steps in conjunction with some kind of decomposition procedure, a class we may generically call proximal operator splitting methods. These methods are relatively easy to implement, especially in parallel computing environments. Many new variants have recently arisen, as has a plethora of convergence rate analyses. The traditional analysis of these algorithms depends on problem monotonicity, a property that generalizes standard convexity assumptions for optimization problems. Nevertheless, applications to nonconvex and mixed-integer problems have begun to appear, with varying degrees of success. Often these applications are purely heuristic, but in some cases they have been shown to yield useful bounding and relaxation information.
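To make the structure of these methods concrete, the following minimal Python sketch applies ADMM to the lasso problem, minimize (1/2)||Ax − b||² + λ||x||₁, split as f(x) + g(z) subject to x − z = 0. The problem instance, parameter values, and function name are illustrative assumptions, not material from any of the workshop talks.

```python
# Minimal ADMM sketch for the lasso (illustrative assumption, not workshop material):
#   minimize (1/2)||Ax - b||^2 + lam * ||x||_1, split as f(x) + g(z), x - z = 0.
import numpy as np

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=200):
    m, n = A.shape
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)  # u is the scaled dual variable
    M = A.T @ A + rho * np.eye(n)  # x-update system matrix; could be factored once
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(M, Atb + rho * (z - u))  # proximal step on the smooth term
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # soft-threshold: prox of lam*||.||_1
        u += x - z  # dual (multiplier) update
    return z

# Usage on synthetic data
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = rng.standard_normal(100) * (rng.random(100) < 0.1)
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = admm_lasso(A, b, lam=0.1)
```

Note how the two proximal steps decouple: the z-update acts coordinatewise and so parallelizes trivially, which is one reason these methods suit parallel computing environments.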
This workshop will bring together theoreticians studying proximal operator splitting algorithms with practitioners using such methods for real-world optimization problems. A particular but not exclusive focus will be problems with nonconvex structures such as integrality constraints. Topics may include theoretical and empirical convergence rate studies, computational experiments on real large-scale problems, asynchronous parallel implementation, and analyzing the validity and accuracy of solutions obtained in nonconvex settings. General goals of the workshop will be to make practitioners aware of the latest theoretical developments and algorithm variants, while exposing theoreticians to the most promising, interesting, timely, and challenging applications.
Monday, June 11, 2018
Breakfast & Check-in
Welcome by Organizers
Jonathan Eckstein, Rutgers University
The ADMM, Progressive Hedging, and Operator Splitting
Jonathan Eckstein, Rutgers University
Panos Patrinos, Katholieke Universiteit Leuven
Kim-Chuan Toh, National University of Singapore
Break
Decentralized Generation Scheduling in Energy Networks
Shabbir Ahmed, Georgia Institute of Technology
Katya Scheinberg, Lehigh University
Lunch
ADMM for Multiaffine Constrained Optimization
Don Goldfarb, Columbia University
On Solving the Quadratic Shortest Path Problem
Renata Sotirov, Tilburg University
Proximal Methods for Conic Optimization over Nonnegative Trigonometric Polynomials
Lieven Vandenberghe, University of California, Los Angeles
Break
Progressive Hedging for Mixed-Integer and Non-Convex Problems: A View from the Trenches
Jean-Paul Watson, Sandia National Laboratories
Jim Luedtke, University of Wisconsin, Madison
Tuesday, June 12, 2018
Breakfast & Check-in
DIMACS Welcome
Tamra Carpenter, DIMACS
ADMM, Accelerated-ADMM, and Continuous Dynamical Systems
Daniel Robinson, Johns Hopkins University
Relaxed Inertial Proximal Algorithms for Monotone Inclusions
Hedy Attouch, University of Montpellier
An Accelerated Primal-dual Algorithm for General Convex-Concave Saddle Point Problems
Serhat Aybat, Pennsylvania State University
Break
Radu Bot, University of Vienna
Augmented Lagrangians and Decomposition in Convex and Nonconvex Programming
Terry Rockafellar, University of Washington
Lunch
On the Convergence and Complexity of Nonconvex ADMM
Shiqian Ma, University of California, Davis
Ernest Ryu, University of California, Los Angeles
On Linear Convergence for Douglas-Rachford Splitting and ADMM
Pontus Giselsson, Lund University
Break
Projective Splitting with Forward Steps: Asynchronous and Block-Iterative Operator Splitting
Patrick Johnstone, Rutgers University
Computational Experience with Asynchronous Projective Hedging
David L. Woodruff, University of California, Davis
Workshop Dinner at Old Man Rafferty's
Wednesday, June 13, 2018
Breakfast & Check-in
A Parallel Forward-backward Splitting Method for Multiterm Composite Convex Optimization
Maicon Alves, Federal University of Santa Catarina
Selective Linearization for Multi-block Statistical Learning Problems
Yu Du, University of Colorado, Denver
Solving ADMM Subproblems using Relative Error Criteria
Jefferson Melo, Federal University of Goiás
Break
Douglas-Rachford Splitting for Pathological Problems
Wotao Yin, University of California, Los Angeles
On the Order of the Operators in the Douglas-Rachford Algorithm
Walaa Moursi, Stanford University
Lunch
Parallel Schur-complement and ADMM Decomposition Strategies for Dynamic Optimization Problems
John Siirola, Sandia National Laboratories
On the Equivalence of Inexact Proximal ALM and ADMM for a Class of Convex Composite Programming
Defeng Sun, Hong Kong Polytechnic University
Source Separation in Astronomy with Constrained Matrix Factorization
Peter Melchior, Princeton University
Presentations are by invitation. Attendance at the workshop is open to all interested participants (subject to space limitations). Please register if you would like to attend this workshop.
Presented in association with the Special Focus on Bridging Continuous and Discrete Optimization.