Forward-backward greedy algorithms for general convex smooth functions over a cardinality constraint

Ji Liu, Ryohei Fujimaki, Jieping Ye

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

12 Citations (Scopus)

Abstract

We consider forward-backward greedy algorithms for solving sparse feature selection problems with general convex smooth functions. A state-of-the-art greedy method, the Forward-Backward greedy algorithm (FoBa-obj), requires solving a large number of optimization problems and is therefore not scalable to large problems. The FoBa-gdt algorithm, which uses gradient information for feature selection at each forward iteration, significantly improves the efficiency of FoBa-obj. In this paper, we systematically analyze the theoretical properties of both algorithms. Our main contributions are: 1) we derive better theoretical bounds than existing analyses of FoBa-obj for general smooth convex functions; 2) we show that FoBa-gdt achieves the same theoretical performance as FoBa-obj under the same condition, namely the restricted strong convexity condition; our new bounds are consistent with the bounds for the special case of least squares and fill a previously existing theoretical gap for general convex smooth functions; 3) we show that the restricted strong convexity condition is satisfied if the number of independent samples is more than k log d, where k is the sparsity number and d is the dimension of the variable; 4) we apply FoBa-gdt (with the conditional random field objective) to the sensor selection problem for human indoor activity recognition, and our results show that FoBa-gdt outperforms other methods based on forward greedy selection and L1-regularization.
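The abstract's description of FoBa-gdt (gradient-based forward selection, plus a backward step that removes features whose deletion barely increases the objective) can be illustrated with a minimal sketch. This is not the paper's implementation: it specializes the general convex smooth objective to least squares, and the function name, the stopping tolerance `delta`, and the refitting helper are illustrative choices.

```python
import numpy as np

def foba_gdt_least_squares(X, y, max_features, delta=1e-4):
    """Sketch of a gradient-based forward-backward greedy scheme
    (in the spirit of FoBa-gdt) for f(w) = 0.5 * ||X w - y||^2.

    Forward step: add the coordinate with the largest absolute gradient.
    Backward step: drop a selected coordinate if removing it increases
    the loss by less than `delta`.
    """
    n, d = X.shape
    support = []              # indices of currently selected features
    w = np.zeros(d)

    def refit(S):
        # Minimize the least-squares loss over the support S only.
        w_S, *_ = np.linalg.lstsq(X[:, S], y, rcond=None)
        w_full = np.zeros(d)
        w_full[S] = w_S
        return w_full

    def loss(w_):
        r = X @ w_ - y
        return 0.5 * float(r @ r)

    while len(support) < max_features:
        # Forward: gradient of f at the current iterate; ignore
        # coordinates already in the support.
        grad = X.T @ (X @ w - y)
        grad[support] = 0.0
        j = int(np.argmax(np.abs(grad)))
        if abs(grad[j]) < 1e-12:
            break                      # no coordinate improves the fit
        support.append(j)
        w = refit(support)
        f_new = loss(w)

        # Backward: repeatedly remove the least useful feature while
        # the resulting loss increase stays below the threshold.
        while len(support) > 1:
            candidates = []
            for k in support:
                S = [s for s in support if s != k]
                candidates.append((loss(refit(S)), k))
            f_drop, k_worst = min(candidates)
            if f_drop - f_new < delta:
                support.remove(k_worst)
                w = refit(support)
                f_new = f_drop
            else:
                break
    return support, w
```

On a noiseless sparse regression problem, the sketch recovers the true support: selecting by gradient magnitude costs one gradient evaluation per forward step, whereas an objective-based forward step (as in FoBa-obj) would solve one optimization problem per candidate feature, which is the scalability gap the abstract describes.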

Original language: English (US)
Title of host publication: 31st International Conference on Machine Learning, ICML 2014
Publisher: International Machine Learning Society (IMLS)
Pages: 765-773
Number of pages: 9
Volume: 1
ISBN (Print): 9781634393973
State: Published - 2014
Event: 31st International Conference on Machine Learning, ICML 2014 - Beijing, China
Duration: Jun 21 2014 - Jun 26 2014



ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Networks and Communications
  • Software

Cite this

Liu, J., Fujimaki, R., & Ye, J. (2014). Forward-backward greedy algorithms for general convex smooth functions over a cardinality constraint. In 31st International Conference on Machine Learning, ICML 2014 (Vol. 1, pp. 765-773). International Machine Learning Society (IMLS).
