Abstract

Neural networks have become very popular in recent years, because of the astonishing success of deep learning in various domains such as image and speech recognition. In many of these domains, specific architectures of neural networks, such as convolutional networks, seem to fit the particular structure of the problem domain very well and can therefore perform in an astonishingly effective way. However, the success of neural networks is not universal across all domains. Indeed, for learning problems without any special structure, or in cases where the data are somewhat limited, neural networks are known not to perform well with respect to traditional machine-learning methods such as random forests. In this article, we show that a carefully designed neural network with random forest structure can have better generalization ability. In fact, this architecture is more powerful than random forests, because the back-propagation algorithm reduces to a more powerful and generalized way of constructing a decision tree. Furthermore, the approach is efficient to train and requires a small constant factor of the number of training examples. This efficiency allows the training of multiple neural networks to improve the generalization accuracy. Experimental results on real-world benchmark datasets demonstrate the effectiveness of the proposed enhancements for classification and regression.
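The idea of a neural network with random-forest structure can be illustrated with a "soft" decision tree: each internal split is a sigmoid gate, so the whole tree is differentiable and trainable by back-propagation, and an ensemble of such trees mirrors a forest. This is a minimal hypothetical sketch for intuition only, not the authors' actual architecture (class and parameter names are invented here):

```python
# Hypothetical sketch: a soft decision tree whose split decisions are
# sigmoid gates, making the tree differentiable end to end. An ensemble
# of such trees imitates a random forest with neural-network parameters.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SoftTree:
    """Depth-d soft decision tree: 2**d - 1 internal gates, 2**d leaves."""

    def __init__(self, n_features, depth):
        self.depth = depth
        n_internal = 2 ** depth - 1
        self.W = rng.normal(scale=0.1, size=(n_internal, n_features))
        self.b = np.zeros(n_internal)
        self.leaf = rng.normal(size=2 ** depth)  # scalar output per leaf

    def predict(self, X):
        # Accumulate the probability of reaching each node, level by level.
        p = np.ones((X.shape[0], 1))
        node = 0
        for d in range(self.depth):
            n_nodes = 2 ** d
            # Gate probability at every internal node on this level.
            g = sigmoid(X @ self.W[node:node + n_nodes].T
                        + self.b[node:node + n_nodes])
            # Split each node's mass between its left and right children.
            p = np.stack([p * g, p * (1 - g)], axis=2)
            p = p.reshape(X.shape[0], 2 * n_nodes)
            node += n_nodes
        # Output is the expected leaf value under the path distribution.
        return p @ self.leaf

X = rng.normal(size=(5, 3))
tree = SoftTree(n_features=3, depth=2)
y = tree.predict(X)  # one regression output per sample
```

Because the path probabilities at each level sum to one, the prediction is a convex combination of the leaf values; gradients with respect to `W`, `b`, and `leaf` could then be computed by ordinary back-propagation.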

Original language: English (US)
Article number: a69
Journal: ACM Transactions on Intelligent Systems and Technology
Volume: 9
Issue number: 6
DOIs: 10.1145/3232230
State: Published - Oct 1 2018

Keywords

  • Classification
  • Neural network
  • Random forest
  • Regression

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Artificial Intelligence

Cite this

Random-forest-inspired neural networks. / Wang, Suhang; Aggarwal, Charu; Liu, Huan.

In: ACM Transactions on Intelligent Systems and Technology, Vol. 9, No. 6, a69, 01.10.2018.

Research output: Contribution to journal › Article

@article{3f13b541ff6c4d7b9efe26a792586868,
title = "Random-forest-inspired neural networks",
abstract = "Neural networks have become very popular in recent years, because of the astonishing success of deep learning in various domains such as image and speech recognition. In many of these domains, specific architectures of neural networks, such as convolutional networks, seem to fit the particular structure of the problem domain very well and can therefore perform in an astonishingly effective way. However, the success of neural networks is not universal across all domains. Indeed, for learning problems without any special structure, or in cases where the data are somewhat limited, neural networks are known not to perform well with respect to traditional machine-learning methods such as random forests. In this article, we show that a carefully designed neural network with random forest structure can have better generalization ability. In fact, this architecture is more powerful than random forests, because the back-propagation algorithm reduces to a more powerful and generalized way of constructing a decision tree. Furthermore, the approach is efficient to train and requires a small constant factor of the number of training examples. This efficiency allows the training of multiple neural networks to improve the generalization accuracy. Experimental results on real-world benchmark datasets demonstrate the effectiveness of the proposed enhancements for classification and regression.",
keywords = "Classification, Neural network, Random forest, Regression",
author = "Suhang Wang and Charu Aggarwal and Huan Liu",
year = "2018",
month = "10",
day = "1",
doi = "10.1145/3232230",
language = "English (US)",
volume = "9",
journal = "ACM Transactions on Intelligent Systems and Technology",
issn = "2157-6904",
publisher = "Association for Computing Machinery (ACM)",
number = "6",

}

TY - JOUR

T1 - Random-forest-inspired neural networks

AU - Wang, Suhang

AU - Aggarwal, Charu

AU - Liu, Huan

PY - 2018/10/1

Y1 - 2018/10/1

N2 - Neural networks have become very popular in recent years, because of the astonishing success of deep learning in various domains such as image and speech recognition. In many of these domains, specific architectures of neural networks, such as convolutional networks, seem to fit the particular structure of the problem domain very well and can therefore perform in an astonishingly effective way. However, the success of neural networks is not universal across all domains. Indeed, for learning problems without any special structure, or in cases where the data are somewhat limited, neural networks are known not to perform well with respect to traditional machine-learning methods such as random forests. In this article, we show that a carefully designed neural network with random forest structure can have better generalization ability. In fact, this architecture is more powerful than random forests, because the back-propagation algorithm reduces to a more powerful and generalized way of constructing a decision tree. Furthermore, the approach is efficient to train and requires a small constant factor of the number of training examples. This efficiency allows the training of multiple neural networks to improve the generalization accuracy. Experimental results on real-world benchmark datasets demonstrate the effectiveness of the proposed enhancements for classification and regression.

AB - Neural networks have become very popular in recent years, because of the astonishing success of deep learning in various domains such as image and speech recognition. In many of these domains, specific architectures of neural networks, such as convolutional networks, seem to fit the particular structure of the problem domain very well and can therefore perform in an astonishingly effective way. However, the success of neural networks is not universal across all domains. Indeed, for learning problems without any special structure, or in cases where the data are somewhat limited, neural networks are known not to perform well with respect to traditional machine-learning methods such as random forests. In this article, we show that a carefully designed neural network with random forest structure can have better generalization ability. In fact, this architecture is more powerful than random forests, because the back-propagation algorithm reduces to a more powerful and generalized way of constructing a decision tree. Furthermore, the approach is efficient to train and requires a small constant factor of the number of training examples. This efficiency allows the training of multiple neural networks to improve the generalization accuracy. Experimental results on real-world benchmark datasets demonstrate the effectiveness of the proposed enhancements for classification and regression.

KW - Classification

KW - Neural network

KW - Random forest

KW - Regression

UR - http://www.scopus.com/inward/record.url?scp=85056450206&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85056450206&partnerID=8YFLogxK

U2 - 10.1145/3232230

DO - 10.1145/3232230

M3 - Article

AN - SCOPUS:85056450206

VL - 9

JO - ACM Transactions on Intelligent Systems and Technology

JF - ACM Transactions on Intelligent Systems and Technology

SN - 2157-6904

IS - 6

M1 - a69

ER -