Statistics and Its Interface

Volume 6 (2013)

Number 2

High-dimensional regression and classification under a class of convex loss functions

Pages: 285 – 299

DOI: https://dx.doi.org/10.4310/SII.2013.v6.n2.a11

Authors

Yuan Jiang (Department of Statistics, Oregon State University, Corvallis, OR, U.S.A.)

Chunming Zhang (Department of Statistics, University of Wisconsin, Madison, WI, U.S.A.)

Abstract

The weighted $L_1$ penalty has been used to revise the traditional Lasso in the linear regression model under quadratic loss. We employ this penalty to investigate high-dimensional regression and classification under a wide class of convex loss functions. We show that when the dimension grows nearly exponentially with the sample size, the penalized estimator possesses the oracle property for suitable weights, and its induced classifier is consistent with the optimal Bayes rule. Moreover, we propose two methods, componentwise regression (CR) and penalized componentwise regression (PCR), for estimating the weights. Both theory and simulation studies support the advantage of PCR over CR in high-dimensional regression and classification. The effectiveness of the proposed method is illustrated using real data sets.
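To illustrate the idea in the simplest setting, the following is a minimal sketch of weighted $L_1$ penalized least squares (quadratic loss), with the weights taken as inverse absolute componentwise (marginal) regression coefficients in the spirit of the CR method. The data, the penalty level `lam`, and the use of ISTA (proximal gradient with soft-thresholding) as the solver are all illustrative assumptions, not the paper's actual algorithm or tuning.

```python
import numpy as np

# Simulated sparse linear model (illustrative, not from the paper)
rng = np.random.default_rng(0)
n, p = 200, 50
beta_true = np.zeros(p)
beta_true[:3] = [3.0, 2.0, -2.0]          # three signal variables
X = rng.standard_normal((n, p))
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# CR-style weights: inverse absolute marginal OLS coefficients,
# so weakly correlated (noise) variables are penalized more heavily
b_marg = (X * y[:, None]).sum(axis=0) / (X ** 2).sum(axis=0)
w = 1.0 / (np.abs(b_marg) + 1e-8)

# ISTA for min_beta (1/2n)||y - X beta||^2 + lam * sum_j w_j |beta_j|
lam = 0.1                                  # assumed penalty level
L = np.linalg.norm(X, 2) ** 2 / n          # Lipschitz constant of the gradient
t = 1.0 / L
beta = np.zeros(p)
for _ in range(500):
    grad = X.T @ (X @ beta - y) / n
    z = beta - t * grad
    # componentwise soft-thresholding with weight-scaled thresholds
    beta = np.sign(z) * np.maximum(np.abs(z) - t * lam * w, 0.0)
```

With well-chosen weights, the signal coefficients are only lightly shrunk while the noise coefficients are driven exactly to zero, which is the behavior behind the oracle property discussed in the abstract.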

Keywords

convex loss, high-dimensional model, optimal Bayes rule, oracle property, weighted $L_1$ penalty

Published 10 May 2013