Dynamic topology representing networks

Jennie Si, S. Lin, M. A. Vuong

Research output: Contribution to journal › Article

26 Citations (Scopus)

Abstract

In this paper, we propose a new algorithm, the Dynamic Topology Representing Network (DTRN), for learning both topology and clustering information from input data. In contrast to other models with adaptive architectures of this kind, the DTRN algorithm adaptively grows the number of output nodes by applying a vigilance test. The clustering procedure is based on a winner-take-quota learning strategy combined with an annealing process that minimizes the associated mean square error. A competitive Hebbian rule learns the global topology information concurrently with the clustering process. The learned topology information is also used for dynamically deleting nodes and for the annealing process. Properties of the DTRN algorithm are discussed. Extensive simulations characterize the effectiveness of the new algorithm in topology preservation, learning speed, and classification tasks, as compared with other algorithms of the same nature. © 2000 Elsevier Science Ltd.
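The abstract names three mechanisms: growing nodes via a vigilance test, a winner-take-quota update with annealing, and competitive Hebbian edge learning. The sketch below illustrates only the first and third of these under simplified assumptions; the winner-take-quota update and the annealing schedule are replaced by a plain winner update, and every class name, parameter, and threshold here is illustrative rather than taken from the paper.

```python
import numpy as np

class DTRNSketch:
    """Illustrative sketch of the growing/topology-learning ideas in DTRN.

    NOT the paper's algorithm: the vigilance test, node update, edge aging,
    and deletion rule below are simplified stand-ins for the mechanisms
    the abstract describes (winner-take-quota learning and annealing
    are omitted).
    """

    def __init__(self, vigilance=1.0, lr=0.1, max_edge_age=20):
        self.vigilance = vigilance      # distance threshold for growing a node
        self.lr = lr                    # learning rate for the winner update
        self.max_edge_age = max_edge_age
        self.nodes = []                 # list of weight vectors
        self.edges = {}                 # (i, j) with i < j  ->  edge age

    def _two_nearest(self, x):
        # indices of the winner and runner-up, plus the winner's distance
        d = [np.linalg.norm(x - w) for w in self.nodes]
        order = np.argsort(d)
        return order[0], order[1], d[order[0]]

    def update(self, x):
        x = np.asarray(x, dtype=float)
        if len(self.nodes) < 2:
            self.nodes.append(x.copy())      # bootstrap the network
            return
        i, j, d_win = self._two_nearest(x)
        if d_win > self.vigilance:
            self.nodes.append(x.copy())      # vigilance test failed: grow
            return
        # move the winner toward the input (stand-in for winner-take-quota)
        self.nodes[i] += self.lr * (x - self.nodes[i])
        # competitive Hebbian rule: connect winner and runner-up
        key = (min(i, j), max(i, j))
        self.edges[key] = 0
        # age the winner's other edges; delete stale ones
        for e in list(self.edges):
            if i in e and e != key:
                self.edges[e] += 1
                if self.edges[e] > self.max_edge_age:
                    del self.edges[e]
```

Feeding two well-separated clusters through `update` grows at least one node per cluster, and the surviving edges link only nearby nodes, which is the topology-preserving behavior the abstract refers to.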

Original language: English (US)
Pages (from-to): 617-627
Number of pages: 11
Journal: Neural Networks
Volume: 13
Issue number: 6
DOI: 10.1016/S0893-6080(00)00039-3
State: Published - Jul 2000


Keywords

  • Adaptive vector quantization
  • Clustering
  • Self-organization
  • Topology preserving
  • Unsupervised learning

ASJC Scopus subject areas

  • Artificial Intelligence
  • Neuroscience(all)

Cite this

Si, J., Lin, S., & Vuong, M. A. (2000). Dynamic topology representing networks. Neural Networks, 13(6), 617-627. https://doi.org/10.1016/S0893-6080(00)00039-3 (PMID: 10987515)