Convergence of ADMM for multi-block nonconvex separable optimization models

  • Ke Guo
  • Deren Han*
  • David Z.W. Wang
  • Tingting Wu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

For minimization problems whose objective function is the sum of two functions without coupled variables and whose constraints are linear, the alternating direction method of multipliers (ADMM) has exhibited its efficiency, and its convergence is well understood. When the number of separable functions exceeds two, or when one of the involved functions is nonconvex, ADMM or its directly extended version may fail to converge. In this paper, we consider multi-block separable optimization problems with linear constraints, where the involved component functions may be nonconvex. Under the assumption that the associated function satisfies the Kurdyka-Łojasiewicz inequality, we prove that any cluster point of the iterative sequence generated by ADMM is a critical point, under the mild condition that the penalty parameter is sufficiently large. We also present sufficient conditions guaranteeing sublinear and linear rates of convergence of the algorithm.
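The directly extended multi-block ADMM discussed above updates the blocks in Gauss-Seidel fashion and then ascends on the multiplier. As a minimal sketch, the iteration can be illustrated on a toy three-block instance with scalar variables, quadratic blocks f_i(x) = ½(x − a_i)², identity coupling maps, and illustrative data a, b, β chosen here for simplicity; this convex instance is only a caricature, since the paper's analysis targets the harder nonconvex setting under the Kurdyka-Łojasiewicz inequality:

```python
# Toy sketch of directly extended (Gauss-Seidel) multi-block ADMM for
#   min  f1(x1) + f2(x2) + f3(x3)   s.t.  x1 + x2 + x3 = b,
# with f_i(x) = 0.5*(x - a_i)^2 and scalar blocks (identity coupling maps).
# The data a, b and the penalty beta are illustrative, not from the paper.

a = [1.0, 2.0, 3.0]   # centers of the quadratic blocks (illustrative data)
b = 10.0              # right-hand side of the linear coupling constraint
beta = 1.0            # penalty parameter (assumed "sufficiently large")

x = [0.0, 0.0, 0.0]   # primal blocks
lam = 0.0             # Lagrange multiplier

for _ in range(2000):
    for i in range(3):
        # residual contributed by the other blocks: sum_{j != i} x_j - b
        r = sum(x) - x[i] - b
        # closed-form minimizer of
        #   0.5*(x_i - a_i)^2 + lam*x_i + 0.5*beta*(x_i + r)^2
        x[i] = (a[i] - lam - beta * r) / (1.0 + beta)
    # dual ascent on the multiplier for the coupling constraint
    lam += beta * (sum(x) - b)

# KKT solution of this instance: x_i = a_i + 4/3, lam = -4/3
print(x, lam)
```

On this strongly convex instance the iterates settle at the KKT point; the paper's contribution is that, for nonconvex blocks, cluster points of the same scheme are still critical points once β is large enough and the KL inequality holds.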

Original language: English
Pages (from-to): 1139-1162
Number of pages: 24
Journal: Frontiers of Mathematics in China
Volume: 12
Issue number: 5
DOIs
State: Published - 1 Oct 2017
Externally published: Yes

Keywords

  • Kurdyka-Łojasiewicz inequality
  • Nonconvex optimization
  • alternating direction method of multipliers (ADMM)
  • separable structure
