Speaker: Xiangxiong Zhang, Purdue University
Title: The asymptotic convergence rate of the Douglas Rachford iteration for basis pursuit
Time: 2016-12-20, 10:00-11:00
Venue: Room 316 of Environmental Building, Renmin University
Abstract: For large-scale nonsmooth convex optimization problems, first-order methods, which use only subgradient information, are usually preferred because they scale well with problem size. Douglas-Rachford (DR) splitting is one of the most popular first-order methods in practice. It is well known that DR applied to the dual problem is equivalent to the alternating direction method of multipliers (ADMM), widely used in nonlinear mechanics, and to the split Bregman method from the image processing community. As motivating examples, we will first briefly review several famous convex recovery results, including compressive sensing, matrix completion, and PhaseLift, which represent the success story of the convex relaxation approach in attacking certain NP-hard linear inverse problems over the last decade. When DR is applied to these convex optimization problems, a question of practical interest is how the parameters in DR affect its performance. We will present an explicit formula for the sharp asymptotic convergence rate of DR for simple L1 minimization. The analysis will be verified on examples of processing seismic data in the Curvelet domain. This is joint work with Prof. Laurent Demanet at MIT.
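To make the setting concrete, here is a minimal sketch (not the speaker's implementation) of the Douglas-Rachford iteration for basis pursuit, min ||x||_1 subject to Ax = b, splitting the objective into the L1 norm and the indicator of the affine constraint set. The proximal operator of the L1 norm is soft-thresholding, and the proximal operator of the indicator is the projection onto {x : Ax = b}; the step parameter `gamma` below is the kind of parameter whose effect on the convergence rate the talk analyzes.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def douglas_rachford_bp(A, b, gamma=1.0, iters=2000):
    """Douglas-Rachford splitting for basis pursuit:
       min ||x||_1  s.t.  A x = b  (A assumed to have full row rank)."""
    m, n = A.shape
    AAt_inv = np.linalg.inv(A @ A.T)  # small m: direct inverse is fine

    def proj(v):
        # Projection onto the affine set {x : A x = b},
        # i.e. the prox of its indicator function.
        return v - A.T @ (AAt_inv @ (A @ v - b))

    z = np.zeros(n)
    for _ in range(iters):
        x = proj(z)  # prox of the constraint indicator
        # DR update: reflect, apply the L1 prox, average back.
        z = z + soft_threshold(2 * x - z, gamma) - x
    return proj(z)  # return a feasible point

# Demo: recover a sparse vector from underdetermined Gaussian measurements,
# a standard compressive-sensing test (sizes here are illustrative).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[[3, 17, 40]] = [1.5, -2.0, 0.8]
b = A @ x_true
x_hat = douglas_rachford_bp(A, b, gamma=1.0)
print(np.linalg.norm(x_hat - x_true))
```

In this sparse, underdetermined regime the L1 minimizer coincides with the true sparse vector with high probability, so the iteration recovers `x_true`; varying `gamma` changes how many iterations that takes, which is exactly the asymptotic rate question the talk addresses.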