Strongly hierarchical factorization machines and ANOVA kernel regression

Ruocheng Guo, Hamidreza Alvari, Paulo Shakarian

Research output: Contribution to conference › Paper › peer-review

2 Scopus citations

Abstract

High-order parametric models that include terms for feature interactions are applied to various data mining tasks in which the ground truth depends on interactions of features. However, with sparse data, the high-dimensional parameters for feature interactions often face three issues: expensive computation, difficulty in parameter estimation, and lack of structure. Previous work has proposed approaches that can partially resolve these three issues. In particular, models with factorized parameters (e.g., Factorization Machines) and sparse learning algorithms (e.g., FTRL-Proximal) can tackle the first two issues but fail to address the third. To impose hierarchical structure on otherwise unstructured parameters, prior approaches apply constraints or complicated regularization terms, but these methods make the optimization problem more challenging. In this work, we propose Strongly Hierarchical Factorization Machines and ANOVA kernel regression, in which all three issues are addressed without making the optimization problem more difficult. Experimental results show the proposed models significantly outperform the state of the art in two data mining tasks: cold-start user response time prediction and stock volatility prediction.
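For context, the sketch below shows the standard second-order Factorization Machine prediction (the model family this paper builds on), using the usual O(dk) reformulation of the pairwise interaction term. It is a minimal illustration only; the strongly hierarchical variant proposed in the paper is not reproduced here, and the names (predict_fm, w0, w, V) are hypothetical.

```python
import numpy as np

def predict_fm(x, w0, w, V):
    """Standard second-order FM prediction (Rendle-style), a sketch:
       y(x) = w0 + <w, x> + sum_{i<j} <V_i, V_j> x_i x_j

    x  : (d,)   feature vector (typically sparse in practice)
    w0 : scalar bias
    w  : (d,)   linear weights
    V  : (d, k) factorized interaction parameters of rank k
    """
    linear = w0 + w @ x
    # Pairwise interactions computed in O(d k) via the identity:
    # sum_{i<j} <V_i, V_j> x_i x_j
    #   = 0.5 * sum_f [ (sum_i V_{if} x_i)^2 - sum_i V_{if}^2 x_i^2 ]
    xv = V.T @ x                     # (k,)
    x2v2 = (V ** 2).T @ (x ** 2)     # (k,)
    interactions = 0.5 * np.sum(xv ** 2 - x2v2)
    return linear + interactions

# Tiny usage example with d = 5 features and rank k = 2
rng = np.random.default_rng(0)
x = rng.random(5)
print(predict_fm(x, w0=0.1, w=rng.random(5), V=rng.random((5, 2))))
```

The factorized parameterization V is what keeps the number of interaction parameters at d*k rather than d^2; the hierarchical structure discussed in the abstract concerns how these interaction terms are tied to the corresponding linear terms.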

Original language: English (US)
Pages: 729-737
Number of pages: 9
DOIs
State: Published - 2018
Event: 2018 SIAM International Conference on Data Mining, SDM 2018 - San Diego, United States
Duration: May 3, 2018 - May 5, 2018

Other

Other: 2018 SIAM International Conference on Data Mining, SDM 2018
Country/Territory: United States
City: San Diego
Period: 5/3/18 - 5/5/18

ASJC Scopus subject areas

  • Computer Science Applications
  • Software
