Improving Design Preference Prediction Accuracy Using Feature Learning

Alex Burnap, Yanxin Pan, Ye Liu, Yi Ren, Honglak Lee, Richard Gonzalez, Panos Y. Papalambros

Research output: Contribution to journal › Article › peer-review

34 Scopus citations

Abstract

Quantitative preference models are used to predict customer choices among design alternatives from prior purchase data or survey answers. This paper examines how to improve the prediction accuracy of such models without collecting more data or changing the model. We propose to use features as an intermediary between the original customer-linked design variables and the preference model, transforming the original variables into a feature representation that captures the underlying design preference task more effectively. We apply this idea to automobile purchase decisions using three feature learning methods: principal component analysis (PCA), low rank and sparse matrix decomposition (LSD), and exponential sparse restricted Boltzmann machine (RBM). We show that the use of features improves prediction accuracy on a data set of over one million real passenger vehicle purchase records. We then show that the interpretation and visualization of these feature representations can help augment data-driven design decisions.
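
The core idea, learning a feature representation of the design variables and feeding it to an otherwise unchanged preference model, can be illustrated with a minimal sketch. The sketch below uses scikit-learn's PCA as the feature learner and logistic regression as a stand-in preference model on synthetic data; the data, variable names, and model choice are illustrative assumptions, not the paper's actual pipeline, data set, or results.

    # Minimal sketch (assumption: not the authors' implementation or data).
    # PCA learns a feature representation of synthetic design variables X;
    # a logistic regression "preference model" is trained on the original
    # variables and on the learned features, and the two are compared.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 40))          # synthetic customer-linked design variables
    y = (X[:, :5].sum(axis=1) + 0.5 * rng.normal(size=5000) > 0).astype(int)  # synthetic purchase choices

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Baseline: preference model trained directly on the original variables.
    baseline = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    baseline.fit(X_train, y_train)

    # Feature-learning variant: PCA features as the intermediary representation,
    # with the same preference model on top (the paper also considers LSD and RBM features).
    featurized = make_pipeline(StandardScaler(), PCA(n_components=10),
                               LogisticRegression(max_iter=1000))
    featurized.fit(X_train, y_train)

    print("accuracy on original variables:", baseline.score(X_test, y_test))
    print("accuracy on PCA features:      ", featurized.score(X_test, y_test))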

Original language: English (US)
Article number: 071404
Journal: Journal of Mechanical Design - Transactions of the ASME
Volume: 138
Issue number: 7
DOIs
State: Published - Jul 1 2016

ASJC Scopus subject areas

  • Mechanics of Materials
  • Mechanical Engineering
  • Computer Science Applications
  • Computer Graphics and Computer-Aided Design

