Layer winner-take-all neural networks based on existing competitive structures

Chi Ming Chen, Jar-Ferr Yang

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

In this paper, we propose generalized layer winner-take-all (WTA) neural networks based on the suggested full WTA networks, which can be extended from any existing WTA structure with a simple weighted-and-sum neuron. With modular regularity and local connection, the layer WTA network in either hierarchical or recursive structure is suitable for a large number of competitors. The complexity and convergence performances of layer and direct WTA neural networks are analyzed. Simulation results and theoretical analyses verify that the layer WTA neural networks with extendibility outperform their original direct WTA structures in terms of low complexity and fast convergence.
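To make the layered competition concrete, the following is a minimal sketch under assumptions, not the authors' network equations: local_wta stands in for any existing small competitive module (e.g., a MAXNET over a few inputs), and layer_wta composes such modules hierarchically so that N competitors are resolved by successive layers of m-input modules. The names local_wta, layer_wta, and group_size are hypothetical, introduced only for illustration.

    def local_wta(values, indices):
        # One small WTA module: returns the member of `indices` whose
        # value is largest. Stands in for any existing competitive
        # structure applied to a few inputs.
        best = indices[0]
        for i in indices[1:]:
            if values[i] > values[best]:
                best = i
        return best

    def layer_wta(values, group_size=4):
        # Layer WTA (sketch): partition the competitors into small groups,
        # run a local WTA per group, then let the group winners compete in
        # the next layer, repeating until one global winner remains.
        indices = list(range(len(values)))
        layers = 0
        while len(indices) > 1:
            indices = [local_wta(values, indices[s:s + group_size])
                       for s in range(0, len(indices), group_size)]
            layers += 1
        return indices[0], layers

    # 64 competitors in groups of 4 resolve in log_4(64) = 3 layers.
    scores = [0.1 * ((i * 37) % 64) for i in range(64)]
    winner, depth = layer_wta(scores, group_size=4)
    print(winner, depth)  # 19 3 (index of the largest score, layer count)

With m-input modules, N competitors need about log_m(N) layers of small, locally connected competitions instead of a single N-way competition, which is consistent with the low-complexity and fast-convergence claims of the abstract.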

Original language: English
Pages (from-to): 25-30
Number of pages: 6
Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Volume: 30
Issue number: 1
DOIs: 10.1109/3477.826944
Publication status: Published - 2000 Jan 1

Fingerprint

  • Neural networks
  • Neurons

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Software
  • Medicine (all)
  • Information Systems
  • Human-Computer Interaction
  • Computer Science Applications
  • Electrical and Electronic Engineering

Cite this

@article{a857f143ee524721a17b1f6f832a6010,
title = "Layer winner-take-all neural networks based on existing competitive structures",
abstract = "In this paper, we propose generalized layer winner-take-all (WTA) neural networks based on the suggested full WTA networks, which can be extended from any existing WTA structure with a simple weighted-and-sum neuron. With modular regularity and local connection, the layer WTA network in either hierarchical or recursive structure is suitable for a large number of competitors. The complexity and convergence performances of layer and direct WTA neural networks are analyzed. Simulation results and theoretical analyses verify that the layer WTA neural networks with extendibility outperform their original direct WTA structures in terms of low complexity and fast convergence.",
author = "Chen, {Chi Ming} and Jar-Ferr Yang",
year = "2000",
month = "1",
day = "1",
doi = "10.1109/3477.826944",
language = "English",
volume = "30",
pages = "25--30",
journal = "IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics",
issn = "1083-4419",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "1",
}

Layer winner-take-all neural networks based on existing competitive structures. / Chen, Chi Ming; Yang, Jar-Ferr.

In: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, Vol. 30, No. 1, 01.01.2000, p. 25-30.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Layer winner-take-all neural networks based on existing competitive structures

AU - Chen, Chi Ming

AU - Yang, Jar-Ferr

PY - 2000/1/1

Y1 - 2000/1/1

N2 - In this paper, we propose generalized layer winner-take-all (WTA) neural networks based on the suggested full WTA networks, which can be extended from any existing WTA structure with a simple weighted-and-sum neuron. With modular regularity and local connection, the layer WTA network in either hierarchical or recursive structure is suitable for a large number of competitors. The complexity and convergence performances of layer and direct WTA neural networks are analyzed. Simulation results and theoretical analyses verify that the layer WTA neural networks with extendibility outperform their original direct WTA structures in terms of low complexity and fast convergence.

AB - In this paper, we propose generalized layer winner-take-all (WTA) neural networks based on the suggested full WTA networks, which can be extended from any existing WTA structure with a simple weighted-and-sum neuron. With modular regularity and local connection, the layer WTA network in either hierarchical or recursive structure is suitable for a large number of competitors. The complexity and convergence performances of layer and direct WTA neural networks are analyzed. Simulation results and theoretical analyses verify that the layer WTA neural networks with extendibility outperform their original direct WTA structures in terms of low complexity and fast convergence.

UR - http://www.scopus.com/inward/record.url?scp=0033885424&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0033885424&partnerID=8YFLogxK

U2 - 10.1109/3477.826944

DO - 10.1109/3477.826944

M3 - Article

VL - 30

SP - 25

EP - 30

JO - IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics

JF - IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics

SN - 1083-4419

IS - 1

ER -