Differentiable Linearized ADMM

We propose D-LADMM, a K-layer LADMM-inspired deep neural network, and rigorously prove that there exists a set of learnable parameters for which D-LADMM generates globally convergent solutions.

Accelerated Alternating Direction Method of Multipliers: an Optimal O(1/K) Nonergodic Analysis

The Alternating Direction Method of Multipliers (ADMM) is widely used for linearly constrained convex problems. It is proven to have an O(1/√K) nonergodic convergence rate and a faster O(1/K) ergodic rate after averaging, where K is the number of iterations.
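To make the iteration scheme behind these rates concrete, the following is a minimal sketch of standard (unaccelerated) ADMM applied to the lasso problem min ½‖Ax − b‖² + λ‖z‖₁ subject to x − z = 0; the function names, step size ρ, and iteration count are illustrative choices, not taken from the paper above.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1, applied elementwise
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    # Minimize 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x - z = 0
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # Cache the matrix inverse used by every x-update
    M = np.linalg.inv(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        x = M @ (Atb + rho * (z - u))       # x-update: ridge-type solve
        z = soft_threshold(x + u, lam / rho) # z-update: soft thresholding
        u = u + x - z                        # dual ascent on the constraint
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 10))
    x_true = np.zeros(10)
    x_true[:3] = [1.0, -2.0, 0.5]          # sparse ground truth
    b = A @ x_true
    z = admm_lasso(A, b, lam=0.1)
    obj = 0.5 * np.sum((A @ z - b) ** 2) + 0.1 * np.sum(np.abs(z))
    print(obj < 0.5 * np.sum(b ** 2))       # improves on the zero solution
```

The z-iterates here converge at the nonergodic rates discussed above; averaging the iterates yields the faster ergodic rate.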

Construction of Incoherent Dictionaries via Direct Babel Function Minimization

We propose an augmented Lagrange multiplier based algorithm to solve this nonconvex and nonsmooth problem with the convergence guarantee that every accumulation point is a KKT point.

Nonconvex Sparse Spectral Clustering by Alternating Direction Method of Multipliers

We propose an efficient Alternating Direction Method of Multipliers (ADMM) algorithm to solve the nonconvex SSC problem and provide a convergence guarantee.

Globally Variance-Constrained Sparse Representation for Rate-Distortion Optimized Image Representation

We propose a Globally Variance-Constrained Sparse Representation (GVCSR) model, in which a variance-constrained rate term is introduced into conventional sparse representation.