Statistics and Its Interface

Volume 2 (2009)

Number 4

Robust and sparse bridge regression

Pages: 481 – 491



Bin Li (Department of Experimental Statistics, Louisiana State University, Baton Rouge, La., U.S.A.)

Qingzhao Yu (School of Public Health, Louisiana State University Health Sciences Center, New Orleans, La., U.S.A.)


It is known that when there are heavy-tailed errors or outliers in the response, least squares methods may fail to produce a reliable estimator. In this paper, we propose a generalized Huber criterion which is highly flexible and robust to large errors. We apply the new criterion to the bridge regression family, yielding robust and sparse bridge regression (RSBR). Obtaining the RSBR solution, however, requires solving a nonconvex minimization problem, which poses a computational challenge. Building on recent advances in difference-of-convex programming, coordinate descent, and local linear approximation, we provide an efficient computational algorithm for this nonconvex problem. Numerical examples show that the proposed RSBR algorithm performs well and is suitable for large-scale problems.
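The abstract does not reproduce the generalized Huber criterion itself. As a rough illustration of the underlying idea only, the sketch below combines the classical Huber loss with a bridge penalty \(\lambda \sum_j |\beta_j|^q\); the tuning constant `delta`, the penalty form, and the function names are illustrative assumptions, not the authors' exact criterion or algorithm.

```python
import numpy as np

def huber(r, delta=1.345):
    """Classical Huber loss: quadratic for small residuals |r| <= delta,
    linear beyond, so gross outliers contribute only linearly."""
    r = np.abs(r)
    return np.where(r <= delta, 0.5 * r**2, delta * (r - 0.5 * delta))

def bridge_objective(beta, X, y, lam=1.0, q=0.5, delta=1.345):
    """Illustrative robust bridge objective (not the paper's criterion):
    Huber loss of the residuals plus the bridge penalty lam * sum |beta_j|^q.
    For q < 1 the penalty is nonconvex, which is the source of the
    computational difficulty the paper addresses."""
    resid = y - X @ beta
    return huber(resid, delta).sum() + lam * np.sum(np.abs(beta) ** q)

# Small demo: a gross outlier in y inflates the Huber term only linearly,
# whereas a squared-error term would grow quadratically.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
beta_true = np.array([2.0, -1.0, 0.0, 0.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.5, size=50)
y[0] += 100.0  # gross outlier in the response
robust_val = bridge_objective(beta_true, X, y, lam=0.1)
```

Minimizing such an objective is nonconvex for q < 1; the paper's contribution is an algorithm (difference-of-convex programming with coordinate descent and local linear approximation) that attacks exactly this difficulty.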


Keywords: coordinate descent, D.C. programming, Huber loss, local linear approximation, regularization

Published 1 January 2009