# Communications in Mathematical Sciences

## Volume 15 (2017)

### Number 2

### Minimization of transformed $l_1$ penalty: Closed form representation and iterative thresholding algorithms

Pages: 511 – 537

DOI: http://dx.doi.org/10.4310/CMS.2017.v15.n2.a9

#### Abstract

The transformed $l_1$ ($\textrm{TL1}$) penalty functions are a one-parameter family of bilinear transformations composed with the absolute value function. Acting on vectors, the $\textrm{TL1}$ penalty interpolates between $l_0$ and $l_1$, similarly to the $l_p$ norm with $p \in (0,1)$. In our companion paper, we showed that $\textrm{TL1}$ is a robust sparsity-promoting penalty in compressed sensing (CS) problems for a broad range of incoherent and coherent sensing matrices. Here we develop an explicit fixed-point representation for the $\textrm{TL1}$-regularized minimization problem. The $\textrm{TL1}$ thresholding functions are in closed form for all parameter values; in contrast, the $l_p$ thresholding functions ($p \in [0,1]$) are in closed form only for $p = 0, 1, 1/2, 2/3$, known as hard, soft, half, and $2/3$ thresholding, respectively. The $\textrm{TL1}$ threshold values differ between the subcritical and supercritical parameter regimes: in the subcritical regime the $\textrm{TL1}$ thresholding functions are continuous, resembling soft thresholding, while in the supercritical regime they are discontinuous, resembling half thresholding. We propose $\textrm{TL1}$ iterative thresholding algorithms and compare them with hard and half thresholding algorithms on CS test problems. For both incoherent and coherent sensing matrices, a proposed $\textrm{TL1}$ iterative thresholding algorithm with adaptive subcritical and supercritical thresholds ($\textrm{TL1IT-s1}$ for short) consistently performs best in sparse signal recovery, with and without measurement noise.
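The iterative thresholding scheme described above can be illustrated with a minimal numerical sketch. This assumes the commonly used $\textrm{TL1}$ form $\rho_a(t) = (a+1)|t|/(a+|t|)$; the paper's closed-form thresholding formulas are replaced here by a brute-force one-dimensional grid search, and all function names and parameter choices below are illustrative rather than the authors' implementation.

```python
def tl1(t, a):
    # Assumed TL1 penalty: rho_a(t) = (a + 1)|t| / (a + |t|).
    # As a -> 0+ it behaves like the l0 indicator; as a -> infinity, like |t|.
    return (a + 1.0) * abs(t) / (a + abs(t))

def prox_tl1(x, lam, a, grid=4001):
    # Thresholding (proximal) operator: argmin_t 0.5*(t - x)^2 + lam*rho_a(t).
    # Evaluated by grid search for illustration only; the paper derives
    # closed-form threshold functions for all parameter values.
    span = abs(x) + 1.0
    best_t, best_v = 0.0, 0.5 * x * x  # candidate t = 0 gives exact sparsity
    for i in range(grid):
        t = -span + 2.0 * span * i / (grid - 1)
        v = 0.5 * (t - x) ** 2 + lam * tl1(t, a)
        if v < best_v:
            best_t, best_v = t, v
    return best_t

def tl1_iterative_thresholding(A, b, lam, a, step, iters=50):
    # Generic proximal-gradient (iterative thresholding) loop for
    #   min_x 0.5*||A x - b||^2 + lam * sum_i rho_a(x_i),
    # a simplified stand-in for the TL1IT schemes discussed in the paper.
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = A x - b, gradient g = A^T r
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # componentwise thresholding of the gradient step
        x = [prox_tl1(x[j] - step * g[j], step * lam, a) for j in range(n)]
    return x
```

With an identity sensing matrix the iteration reduces to componentwise thresholding of $b$: entries below the threshold are set exactly to zero, while large entries are only mildly shrunk, which is the qualitative behavior that distinguishes $\textrm{TL1}$ thresholding from soft thresholding.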

#### Keywords

Transformed $l_1$ penalty, closed form thresholding functions, iterative thresholding algorithms, compressed sensing, robust recovery

#### 2010 Mathematics Subject Classification

94A12, 94A15

Published 21 February 2017