Epoch gradient descent for smoothed hinge-loss linear SVMs

Soomin Lee, Angelia Nedich

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

A gradient descent method for strongly convex problems with Lipschitz continuous gradients requires only O(log_q ε) iterations to obtain an ε-accurate solution, where q is a constant in (0, 1). Support Vector Machines (SVMs) penalized with the popular hinge loss are strongly convex, but they do not have a Lipschitz continuous gradient. We obtain SVMs with both strong convexity and a Lipschitz continuous gradient using Nesterov's smooth approximation technique [1]. The simple gradient method applied to the smoothed SVM converges fast, but the solution it obtains is not the exact maximum-margin separating hyperplane. To obtain an exact solution as well as fast convergence, we propose a hybrid approach, epoch gradient descent.
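The paper's algorithm is not reproduced in this record; as a rough sketch of the setup the abstract describes, the Python code below applies plain gradient descent to a strongly convex SVM objective with a standard Nesterov-style smoothed hinge loss, plus a hypothetical epoch wrapper that halves the smoothing parameter μ each epoch while warm-starting. The function names and the μ-halving schedule are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def smoothed_hinge_grad(z, mu):
    # Nesterov-smoothed hinge loss max(0, 1 - z): quadratic within mu of the
    # kink, linear beyond it. Returns dloss/dz; the smoothed loss has a
    # (1/mu)-Lipschitz gradient, unlike the plain hinge.
    return -np.clip((1.0 - z) / mu, 0.0, 1.0)

def gd_smoothed_svm(X, y, lam=0.1, mu=0.1, iters=500, w0=None):
    # Plain gradient descent on (lam/2)||w||^2 + (1/n) sum l_mu(y_i x_i^T w).
    # The objective is lam-strongly convex with an L-Lipschitz gradient, so a
    # constant step 1/L converges linearly -- but to the *smoothed* optimum.
    n, d = X.shape
    w = np.zeros(d) if w0 is None else w0.copy()
    L = lam + np.linalg.norm(X, 2) ** 2 / (n * mu)  # gradient Lipschitz constant
    for _ in range(iters):
        g = smoothed_hinge_grad(y * (X @ w), mu)
        w -= (1.0 / L) * (lam * w + X.T @ (g * y) / n)
    return w

def epoch_gd_svm(X, y, lam=0.1, mu0=1.0, epochs=5, iters=200):
    # Hypothetical epoch scheme (an assumption, not the paper's algorithm):
    # solve the smoothed problem, halve mu, and warm-start the next epoch,
    # pushing the iterates toward the original non-smooth hinge-loss optimum.
    w, mu = None, mu0
    for _ in range(epochs):
        w = gd_smoothed_svm(X, y, lam=lam, mu=mu, iters=iters, w0=w)
        mu *= 0.5
    return w
```

On linearly separable toy data this recovers a separating hyperplane quickly; the point of shrinking μ is that each fixed μ gives fast (linear-rate) convergence, while μ → 0 removes the smoothing bias.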

Original language: English (US)
Title of host publication: 2013 American Control Conference, ACC 2013
Pages: 4789-4794
Number of pages: 6
State: Published - 2013
Externally published: Yes
Event: 2013 1st American Control Conference, ACC 2013 - Washington, DC, United States
Duration: Jun 17 2013 - Jun 19 2013

Other

Other: 2013 1st American Control Conference, ACC 2013
Country: United States
City: Washington, DC
Period: 6/17/13 - 6/19/13

Fingerprint

Hinges
Support vector machines
Gradient methods

ASJC Scopus subject areas

  • Electrical and Electronic Engineering

Cite this

Lee, S., & Nedich, A. (2013). Epoch gradient descent for smoothed hinge-loss linear SVMs. In 2013 American Control Conference, ACC 2013 (pp. 4789-4794). [6580579]

@inproceedings{5065239e281443ce8f0363439840ca73,
title = "Epoch gradient descent for smoothed hinge-loss linear SVMs",
abstract = "A gradient descent method for strongly convex problems with Lipschitz continuous gradients requires only O(log_q ε) iterations to obtain an ε-accurate solution, where q is a constant in (0, 1). Support Vector Machines (SVMs) penalized with the popular hinge loss are strongly convex, but they do not have a Lipschitz continuous gradient. We obtain SVMs with both strong convexity and a Lipschitz continuous gradient using Nesterov's smooth approximation technique [1]. The simple gradient method applied to the smoothed SVM converges fast, but the solution it obtains is not the exact maximum-margin separating hyperplane. To obtain an exact solution as well as fast convergence, we propose a hybrid approach, epoch gradient descent.",
author = "Soomin Lee and Angelia Nedich",
year = "2013",
language = "English (US)",
isbn = "9781479901777",
pages = "4789--4794",
booktitle = "2013 American Control Conference, ACC 2013",

}

TY - GEN

T1 - Epoch gradient descent for smoothed hinge-loss linear SVMs

AU - Lee, Soomin

AU - Nedich, Angelia

PY - 2013

Y1 - 2013

N2 - A gradient descent method for strongly convex problems with Lipschitz continuous gradients requires only O(log_q ε) iterations to obtain an ε-accurate solution, where q is a constant in (0, 1). Support Vector Machines (SVMs) penalized with the popular hinge loss are strongly convex, but they do not have a Lipschitz continuous gradient. We obtain SVMs with both strong convexity and a Lipschitz continuous gradient using Nesterov's smooth approximation technique [1]. The simple gradient method applied to the smoothed SVM converges fast, but the solution it obtains is not the exact maximum-margin separating hyperplane. To obtain an exact solution as well as fast convergence, we propose a hybrid approach, epoch gradient descent.

AB - A gradient descent method for strongly convex problems with Lipschitz continuous gradients requires only O(log_q ε) iterations to obtain an ε-accurate solution, where q is a constant in (0, 1). Support Vector Machines (SVMs) penalized with the popular hinge loss are strongly convex, but they do not have a Lipschitz continuous gradient. We obtain SVMs with both strong convexity and a Lipschitz continuous gradient using Nesterov's smooth approximation technique [1]. The simple gradient method applied to the smoothed SVM converges fast, but the solution it obtains is not the exact maximum-margin separating hyperplane. To obtain an exact solution as well as fast convergence, we propose a hybrid approach, epoch gradient descent.

UR - http://www.scopus.com/inward/record.url?scp=84883533608&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84883533608&partnerID=8YFLogxK

M3 - Conference contribution

SN - 9781479901777

SP - 4789

EP - 4794

BT - 2013 American Control Conference, ACC 2013

ER -