Alternate Model Growth and Pruning for Efficient Training of Recommendation Systems

Xiaocong Du, Bhargav Bhushanam, Jiecao Yu, Dhruv Choudhary, Tianxiang Gao, Sherman Wong, Louis Feng, Jongsoo Park, Yu Cao, Arun Kejariwal

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Scopus citations

Abstract

Deep learning recommendation systems at scale have achieved remarkable gains by increasing model capacity (i.e., wider and deeper neural networks), but this comes at significant training and infrastructure cost. Model pruning is an effective technique for reducing the computation overhead of deep neural networks by removing redundant parameters. However, modern recommendation systems still demand large model capacity to handle big data, so pruning a recommendation model at scale shrinks its capacity and consequently lowers accuracy. To reduce computation cost without sacrificing model capacity, we propose a dynamic training scheme, namely alternate model growth and pruning, which alternately constructs and prunes weights over the course of training. Our method leverages structured sparsification to reduce computational cost without hurting model capacity at the end of offline training, so that a full-size model is available in the recurring training stage to learn new data in real time. To the best of our knowledge, this is the first work to provide in-depth experiments and discussion on applying structural dynamics to recommendation systems at scale to reduce training cost. The proposed method is validated on an open-source deep learning recommendation model (DLRM) and state-of-the-art industrial-scale production models.
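To illustrate the alternating scheme the abstract describes, below is a minimal sketch (not the authors' implementation) of one growth/pruning cycle with structured sparsity on a toy NumPy weight matrix. Rows with the smallest L2 norm are zeroed (structured pruning), and those rows are later re-initialized (model growth) so the full capacity is restored; the names `prune_rows` and `grow_rows` are illustrative, not from the paper.

```python
# Hedged sketch of alternate growth and pruning with structured (row-wise)
# sparsity. The paper operates on industrial-scale models; here a small
# NumPy matrix stands in for one layer's weights.
import numpy as np

def prune_rows(weights, sparsity):
    """Structured pruning: zero out the rows with the smallest L2 norm.

    Returns the pruned weights and a boolean mask of the removed rows.
    """
    norms = np.linalg.norm(weights, axis=1)
    n_prune = int(sparsity * weights.shape[0])
    pruned_idx = np.argsort(norms)[:n_prune]
    mask = np.zeros(weights.shape[0], dtype=bool)
    mask[pruned_idx] = True
    pruned = weights.copy()
    pruned[mask] = 0.0
    return pruned, mask

def grow_rows(weights, mask, rng, scale=0.01):
    """Model growth: re-initialize previously pruned rows with small random
    values so full capacity is available for further training."""
    grown = weights.copy()
    grown[mask] = rng.normal(0.0, scale, size=(mask.sum(), weights.shape[1]))
    return grown

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 4))

# One cycle: prune 50% of rows, train on the sparse model (omitted), regrow.
w_pruned, mask = prune_rows(w, sparsity=0.5)
w_full = grow_rows(w_pruned, mask, rng)

assert (np.linalg.norm(w_pruned, axis=1) == 0).sum() == 4  # 4 rows zeroed
assert (np.linalg.norm(w_full, axis=1) > 0).all()          # capacity restored
```

In the paper's setting, repeating this cycle keeps intermediate training sparse (and thus cheaper) while ending offline training with a full-size model ready for recurring training.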

Original language: English (US)
Title of host publication: Proceedings - 20th IEEE International Conference on Machine Learning and Applications, ICMLA 2021
Editors: M. Arif Wani, Ishwar K. Sethi, Weisong Shi, Guangzhi Qu, Daniela Stan Raicu, Ruoming Jin
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1421-1428
Number of pages: 8
ISBN (Electronic): 9781665443371
DOIs
State: Published - 2021
Event: 20th IEEE International Conference on Machine Learning and Applications, ICMLA 2021 - Virtual, Online, United States
Duration: Dec 13, 2021 - Dec 16, 2021

Publication series

Name: Proceedings - 20th IEEE International Conference on Machine Learning and Applications, ICMLA 2021

Conference

Conference: 20th IEEE International Conference on Machine Learning and Applications, ICMLA 2021
Country/Territory: United States
City: Virtual, Online
Period: 12/13/21 - 12/16/21

Keywords

  • Deep learning
  • Deep neural network
  • Efficient model training
  • Model growth
  • Model pruning
  • Recommendation system

ASJC Scopus subject areas

  • Safety, Risk, Reliability and Quality
  • Health Informatics
  • Artificial Intelligence
  • Computer Science Applications

