Institutional Digital Repository
Shreenivas Deshpande Library, IIT (BHU), Varanasi

Performance comparison of proximal methods for regression with nonsmooth regularizers on real datasets

dc.contributor.author: Verma M.; Shukla K.K.
dc.date.accessioned: 2025-05-24T09:27:25Z
dc.description.abstract: First-order methods are known to be effective for high-dimensional machine learning problems due to their fast convergence and low per-iteration complexity. In machine learning, many problems are formulated as convex minimization problems with a smooth loss function and non-smooth regularizers. Learning with sparsity-inducing regularizers belongs to this class of problems, for which a number of first-order methods are already available in the optimization and machine learning literature. Proximal methods also belong to the class of first-order methods and lead to sparser models. In this paper, we discuss three state-of-the-art proximal methods for the problem of regression, where the loss minimization is coupled with a sparsity-inducing regularizer. This paper presents, for the first time, a comparison of these methods based on practical convergence rates, prediction accuracy, and consumed CPU time on six real datasets. © 2016 IEEE.
dc.identifier.doi: https://doi.org/10.1109/ICACCI.2016.7732086
dc.identifier.uri: http://172.23.0.11:4000/handle/123456789/16161
dc.relation.ispartofseries: 2016 International Conference on Advances in Computing, Communications and Informatics, ICACCI 2016
dc.title: Performance comparison of proximal methods for regression with nonsmooth regularizers on real datasets
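The abstract's setting (a smooth regression loss plus a non-smooth, sparsity-inducing regularizer, solved by a proximal first-order method) can be illustrated with a minimal sketch of proximal gradient descent (ISTA) for the lasso problem, min_w 0.5·||Xw − y||² + λ·||w||₁. This is a generic illustration of the technique, not the specific methods or datasets compared in the paper; all names and step-size choices here are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding),
    # which shrinks each coordinate toward zero, inducing sparsity.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam, n_iter=200):
    """Proximal gradient descent (ISTA) for the lasso objective
    0.5 * ||X w - y||^2 + lam * ||w||_1."""
    n, d = X.shape
    w = np.zeros(d)
    # Constant step size 1/L, where L = ||X||_2^2 is the Lipschitz
    # constant of the gradient of the smooth least-squares part.
    L = np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)                   # gradient of smooth loss
        w = soft_threshold(w - grad / L, lam / L)  # proximal (shrinkage) step
    return w
```

Each iteration costs one gradient evaluation plus a cheap componentwise shrinkage, which is the "low per-iteration complexity" the abstract refers to; the soft-thresholding step is what drives coefficients exactly to zero and yields sparse models.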
