Abstract

In this paper, we propose a new end-to-end deep neural network model for time-series classification (TSC) that emphasizes both accuracy and interpretability. The proposed model contains a convolutional network component to extract high-level features and a recurrent network component to model the temporal characteristics of TS data. In addition, a feedforward fully connected network with sparse group lasso (SGL) regularization generates the final classification. The proposed architecture not only achieves satisfying classification accuracy but also gains interpretability through the SGL regularization. All these networks are connected and jointly trained in an end-to-end framework, so the model can be applied to TSC tasks across different domains without feature engineering. Our experiments on various TS data sets show that the proposed model outperforms the traditional convolutional neural network model in classification accuracy, and also demonstrate how SGL contributes to better model interpretation.
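The SGL regularization mentioned in the abstract combines an element-wise L1 term (sparsity of individual weights) with a group-norm term (sparsity of whole groups of weights), which is what makes group-level feature selection, and hence interpretation, possible. A minimal NumPy sketch of the penalty computation follows; the function name, the group partition, and the regularization weights `lam1` and `lam2` are illustrative assumptions, not values or code from the paper.

```python
import numpy as np

def sgl_penalty(w, groups, lam1=0.01, lam2=0.01):
    """Sparse group lasso penalty:
    lam1 * ||w||_1  +  lam2 * sum_g sqrt(p_g) * ||w_g||_2

    w      : 1-D weight vector
    groups : list of index arrays partitioning w (p_g = group size)
    lam1, lam2 : illustrative regularization weights (hyperparameters)
    """
    l1 = lam1 * np.sum(np.abs(w))
    group_term = lam2 * sum(
        np.sqrt(len(g)) * np.linalg.norm(w[g]) for g in groups
    )
    return l1 + group_term

# Toy example: six weights split into two groups of three.
w = np.array([0.0, 0.0, 0.0, 1.0, -2.0, 2.0])
groups = [np.arange(0, 3), np.arange(3, 6)]
penalty = sgl_penalty(w, groups, lam1=0.1, lam2=0.1)
# The first group is entirely zero, so it contributes nothing to
# either term -- this is the group-level sparsity SGL encourages.
```

In training, this penalty would be added to the classification loss, so gradient descent drives entire groups of connections toward zero, revealing which input segments or features the model actually uses.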

Original language: English (US)
Journal: IEEE Transactions on Neural Networks and Learning Systems
DOI: 10.1109/TNNLS.2017.2772336
State: Accepted/In press - Dec 7 2017

Keywords

  • Brain modeling
  • Convolutional neural network (CNN)
  • Data models
  • deep learning
  • Feature extraction
  • recurrent neural network (RNN)
  • Recurrent neural networks
  • regularization
  • sparse group lasso (SGL)
  • time-series classification (TSC)
  • Training

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

Cite this

@article{c15199ed707249ca8fff7083c8b63983,
title = "GCRNN: Group-Constrained Convolutional Recurrent Neural Network",
abstract = "In this paper, we propose a new end-to-end deep neural network model for time-series classification (TSC) with emphasis on both the accuracy and the interpretation. The proposed model contains a convolutional network component to extract high-level features and a recurrent network component to enhance the modeling of the temporal characteristics of TS data. In addition, a feedforward fully connected network with the sparse group lasso (SGL) regularization is used to generate the final classification. The proposed architecture not only achieves satisfying classification accuracy, but also obtains good interpretability through the SGL regularization. All these networks are connected and jointly trained in an end-to-end framework, and it can be generally applied to TSC tasks across different domains without the efforts of feature engineering. Our experiments in various TS data sets show that the proposed model outperforms the traditional convolutional neural network model for the classification accuracy, and also demonstrate how the SGL contributes to a better model interpretation.",
keywords = "Brain modeling, Convolutional neural network (CNN), Data models, deep learning, Feature extraction, recurrent neural network (RNN), Recurrent neural networks, regularization, sparse group lasso (SGL), time-series classification (TSC)., Training",
author = "Sangdi Lin and George Runger",
year = "2017",
month = "12",
day = "7",
doi = "10.1109/TNNLS.2017.2772336",
language = "English (US)",
journal = "IEEE Transactions on Neural Networks and Learning Systems",
issn = "2162-237X",
publisher = "IEEE Computational Intelligence Society",

}

TY - JOUR

T1 - GCRNN

T2 - Group-Constrained Convolutional Recurrent Neural Network

AU - Lin, Sangdi

AU - Runger, George

PY - 2017/12/7

Y1 - 2017/12/7

N2 - In this paper, we propose a new end-to-end deep neural network model for time-series classification (TSC) with emphasis on both the accuracy and the interpretation. The proposed model contains a convolutional network component to extract high-level features and a recurrent network component to enhance the modeling of the temporal characteristics of TS data. In addition, a feedforward fully connected network with the sparse group lasso (SGL) regularization is used to generate the final classification. The proposed architecture not only achieves satisfying classification accuracy, but also obtains good interpretability through the SGL regularization. All these networks are connected and jointly trained in an end-to-end framework, and it can be generally applied to TSC tasks across different domains without the efforts of feature engineering. Our experiments in various TS data sets show that the proposed model outperforms the traditional convolutional neural network model for the classification accuracy, and also demonstrate how the SGL contributes to a better model interpretation.

AB - In this paper, we propose a new end-to-end deep neural network model for time-series classification (TSC) with emphasis on both the accuracy and the interpretation. The proposed model contains a convolutional network component to extract high-level features and a recurrent network component to enhance the modeling of the temporal characteristics of TS data. In addition, a feedforward fully connected network with the sparse group lasso (SGL) regularization is used to generate the final classification. The proposed architecture not only achieves satisfying classification accuracy, but also obtains good interpretability through the SGL regularization. All these networks are connected and jointly trained in an end-to-end framework, and it can be generally applied to TSC tasks across different domains without the efforts of feature engineering. Our experiments in various TS data sets show that the proposed model outperforms the traditional convolutional neural network model for the classification accuracy, and also demonstrate how the SGL contributes to a better model interpretation.

KW - Brain modeling

KW - Convolutional neural network (CNN)

KW - Data models

KW - deep learning

KW - Feature extraction

KW - recurrent neural network (RNN)

KW - Recurrent neural networks

KW - regularization

KW - sparse group lasso (SGL)

KW - time-series classification (TSC).

KW - Training

UR - http://www.scopus.com/inward/record.url?scp=85038388246&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85038388246&partnerID=8YFLogxK

U2 - 10.1109/TNNLS.2017.2772336

DO - 10.1109/TNNLS.2017.2772336

M3 - Article

JO - IEEE Transactions on Neural Networks and Learning Systems

JF - IEEE Transactions on Neural Networks and Learning Systems

SN - 2162-237X

ER -