Quantization effects of Hebbian-Type associative memories

Pau-Choo Chung, Yi Nung Chung, Ching Tsorng Tsai

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Effects of quantization strategies in Hebbian-type associative memories are explored in this paper. The strategies considered are two-level quantization, three-level quantization with a cut-off threshold, and linear quantization. The two-level strategy clips positive interconnections to +1 and negative interconnections to -1. The three-level strategy applies the same clipping, but only to interconnections whose magnitudes exceed a cut-off threshold; interconnections within the threshold are set to zero. Results indicate that three-level quantization with a properly selected cut-off threshold gives the network higher performance than two-level quantization. The performance of a network with linear quantization is also compared with that of a network with three-level quantization: although linear quantization preserves more of the network interconnections, it does not significantly enhance network performance over three-level quantization. Hence, it is concluded that three-level quantization with an optimal threshold is the better choice when network implementations are considered.
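
For concreteness, here is a minimal sketch of the three strategies, assuming the standard sum-of-outer-products Hebbian rule over bipolar (+1/-1) patterns and synchronous sign-rule recall; the function names, the uniform-step linear quantizer, and the toy parameters are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def hebbian_weights(patterns):
    """Sum-of-outer-products (Hebbian) weight matrix for bipolar patterns."""
    W = patterns.T @ patterns           # patterns: (num_patterns, num_neurons)
    np.fill_diagonal(W, 0)              # zero self-connections, as is standard
    return W

def two_level(W):
    """Two-level strategy: clip positive interconnections to +1, negative to -1."""
    return np.sign(W)

def three_level(W, theta):
    """Three-level strategy: same clipping, but interconnections within the
    cut-off threshold (|w| <= theta) are set to zero."""
    Q = np.sign(W)
    Q[np.abs(W) <= theta] = 0
    return Q

def linear_quantize(W, levels=8):
    """Uniform-step ("linear") quantization onto a fixed number of levels;
    the paper's exact linear scheme may differ -- this is an assumption."""
    step = 2 * np.abs(W).max() / (levels - 1)
    return np.round(W / step) * step

def recall(W, probe, iters=10):
    """Synchronous sign-rule recall from a (possibly noisy) probe."""
    x = probe.copy()
    for _ in range(iters):
        x = np.sign(W @ x)
        x[x == 0] = 1                   # break ties toward +1
    return x

# Toy comparison: store 5 random bipolar patterns in a 64-neuron network,
# corrupt one pattern, and try to recall it under each quantization strategy.
rng = np.random.default_rng(0)
pats = rng.choice([-1, 1], size=(5, 64))
W = hebbian_weights(pats)
probe = pats[0].copy()
probe[:6] *= -1                         # flip 6 bits as input noise
for name, Wq in [("two-level", two_level(W)),
                 ("three-level", three_level(W, theta=1)),
                 ("linear", linear_quantize(W))]:
    ok = np.array_equal(recall(Wq, probe), pats[0])
    print(f"{name}: recalled exactly = {ok}")
```

In this sketch the cut-off threshold theta is the tunable parameter: raising it prunes more low-magnitude interconnections, which is what makes the three-level strategy attractive when hardware implementation is considered.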

Original language: English
Title of host publication: 1993 IEEE International Conference on Neural Networks
Publisher: Publ by IEEE
Pages: 1366-1370
Number of pages: 5
ISBN (Print): 0780312007
Publication status: Published - 1993 Jan 1
Event: 1993 IEEE International Conference on Neural Networks - San Francisco, California, USA
Duration: 1993 Mar 28 - 1993 Apr 1

Publication series

Name: 1993 IEEE International Conference on Neural Networks

Other

Other: 1993 IEEE International Conference on Neural Networks
City: San Francisco, California, USA
Period: 93-03-28 - 93-04-01

Fingerprint

  • Network performance
  • Data storage equipment

All Science Journal Classification (ASJC) codes

  • Engineering (all)
  • Control and Systems Engineering
  • Software
  • Artificial Intelligence

Cite this

Chung, P-C., Chung, Y. N., & Tsai, C. T. (1993). Quantization effects of Hebbian-Type associative memories. In 1993 IEEE International Conference on Neural Networks (pp. 1366-1370). (1993 IEEE International Conference on Neural Networks). Publ by IEEE.

Scopus record: http://www.scopus.com/inward/record.url?scp=0027211266&partnerID=8YFLogxK