Institutional Digital Repository
Shreenivas Deshpande Library, IIT (BHU), Varanasi

Performance comparison of proximal methods for regression with nonsmooth regularizers on real datasets

Abstract

First-order methods are known to be effective for high-dimensional machine learning problems due to their fast convergence and low per-iteration complexity. Many problems in machine learning are formulated as convex minimization of a smooth loss function plus a non-smooth regularizer. Learning with sparsity-inducing regularizers belongs to this class of problems, and a number of first-order methods for it are already available in the optimization and machine learning literature. Proximal methods also belong to this class of first-order methods and lead to better sparse models. In this paper, we discuss three state-of-the-art proximal methods for regression in which loss minimization is coupled with a sparsity-inducing regularizer. The paper presents, for the first time, a comparison of these methods in terms of practical convergence rates, prediction accuracy, and CPU time consumed on six real datasets. © 2016 IEEE.
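The abstract does not name the three methods it compares, so they are not reproduced here. As a minimal, illustrative sketch of the problem class it describes (smooth loss plus non-smooth, sparsity-inducing regularizer), the following Python snippet implements the basic proximal gradient method (ISTA) for l1-regularized least squares, i.e. the lasso. The function names, step-size rule, and toy data are illustrative assumptions, not the authors' experimental setup.

import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1: shrinks each entry toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, n_iters=500):
    # Proximal gradient (ISTA) for: minimize 0.5*||Ax - b||^2 + lam*||x||_1.
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of the
    # gradient of the smooth least-squares term.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                   # gradient of the smooth loss
        x = soft_threshold(x - grad / L, lam / L)  # proximal (shrinkage) step
    return x

# Toy usage (illustrative data, not the paper's six real datasets):
# recover a 5-sparse signal from 100 noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))
x_true = np.zeros(50)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = ista(A, b, lam=0.1)
print("nonzero coefficients recovered:", np.count_nonzero(np.abs(x_hat) > 1e-3))

The soft-thresholding step is what sets coefficients exactly to zero, which is why proximal methods yield sparse models; accelerated and stochastic variants of this basic scheme underlie the state-of-the-art methods typically compared in studies of this kind.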
