Methods and Applications of Analysis
Volume 26 (2019)
Number 3
Special Issue in Honor of Roland Glowinski (Part 2 of 2)
Guest Editors: Xiaoping Wang (Hong Kong University of Science and Technology) and Xiaoming Yuan (The University of Hong Kong)
A Douglas–Rachford method for sparse extreme learning machine
Pages: 217 – 234
DOI: https://dx.doi.org/10.4310/MAA.2019.v26.n3.a1
Abstract
Operator-splitting methods have gained popularity in various areas of computational science, including machine learning. In this article, we present a novel nonsmooth and nonconvex formulation, together with an efficient associated solution algorithm, for deriving a sparse predictive machine learning model. The model structure is based on the so-called extreme learning machine with a randomly generated basis. Our computational experiments confirm the efficiency of the proposed method when a bold (large) timestep is selected. Comparative tests also indicate interesting results concerning the use of the $l_0$ seminorm for ultimate sparsity.
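The combination described in the abstract — an extreme learning machine (random hidden layer, trained output weights) with $l_0$-regularized output weights solved by Douglas–Rachford splitting — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the data, hidden-layer size, penalty weight `lam`, and timestep `gamma` are all assumed for the example; the $l_0$ proximal step reduces to hard thresholding.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumed for illustration)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.sin(np.pi * X[:, 0]) * X[:, 1]

# Extreme learning machine: random hidden layer, only output weights are trained
n_hidden = 100
W = rng.normal(size=(X.shape[1], n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)  # hidden-layer output matrix (randomly generated basis)

# Sparse output weights via Douglas-Rachford splitting applied (heuristically,
# since the problem is nonconvex) to
#     min_beta  0.5 * ||H beta - y||^2  +  lam * ||beta||_0
lam, gamma = 1e-3, 1.0  # penalty weight and timestep (assumed values)

# Prefactor the linear system appearing in the prox of the smooth term
A = np.eye(n_hidden) + gamma * (H.T @ H)

def prox_f(z):
    # prox of gamma * 0.5*||H beta - y||^2:
    # solve (I + gamma H^T H) beta = z + gamma H^T y
    return np.linalg.solve(A, z + gamma * H.T @ y)

def prox_g(z):
    # prox of gamma * lam * ||.||_0: hard thresholding at sqrt(2*gamma*lam)
    out = z.copy()
    out[np.abs(out) < np.sqrt(2.0 * gamma * lam)] = 0.0
    return out

# Douglas-Rachford iteration on the auxiliary variable z
z = np.zeros(n_hidden)
for _ in range(200):
    beta = prox_f(z)
    z = z + prox_g(2.0 * beta - z) - beta

# The sparse estimate is the reflected-prox point
beta_sparse = prox_g(2.0 * prox_f(z) - z)
print("nonzero weights:", np.count_nonzero(beta_sparse), "of", n_hidden)
```

The hard-thresholding step is the exact proximal map of the $l_0$ seminorm, which is what makes the nonconvex formulation tractable within the splitting framework; the timestep `gamma` plays the role of the "bold" parameter choice mentioned in the abstract.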
Keywords
operator-splitting, Douglas–Rachford, extreme learning machine, sparse regularization
2010 Mathematics Subject Classification
90C26
Received 21 December 2017
Accepted 12 April 2019
Published 2 April 2020