Quantization effects of Hebbian-Type associative memories

Pau Choo Chung, Yi Nung Chung, Ching Tsorng Tsai

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


The effects of quantization strategies in Hebbian-type associative memories are explored in this paper. The quantization strategies considered include two-level quantization, three-level quantization with a cut-off threshold, and linear quantization. The two-level strategy clips positive interconnections to +1 and negative interconnections to -1. The three-level quantization applies the same clipping, but only to interconnections whose magnitudes exceed a cut-off threshold; interconnections within the threshold are set to zero. Results indicate that three-level quantization with a properly selected cut-off threshold gives the network higher performance than two-level quantization. The performance of a network with linear quantization is also compared with that of a network with three-level quantization. It is found that linear quantization, although it preserves more of the network interconnections, does not significantly enhance network performance over three-level quantization. Hence, it is concluded that three-level quantization with an optimal threshold is the better choice when network implementations are considered.
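The three quantization strategies described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Hebbian weights are formed as sums of outer products of bipolar patterns (a standard Hebbian-type construction), and the threshold and step parameters are hypothetical examples.

```python
def hebbian_weights(patterns):
    """Sum of outer products of bipolar (+1/-1) patterns, zero diagonal."""
    n = len(patterns[0])
    W = [[0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j]
    return W

def two_level(w):
    # Clip positive interconnections to +1, negative to -1.
    return 1 if w > 0 else (-1 if w < 0 else 0)

def three_level(w, theta):
    # Same clipping, but only for magnitudes above the cut-off
    # threshold theta; interconnections within it are set to zero.
    if w > theta:
        return 1
    if w < -theta:
        return -1
    return 0

def linear_quantize(w, step):
    # Round each interconnection to the nearest multiple of `step`,
    # preserving more of the original weight values.
    return step * round(w / step)

# Example: quantize the weight matrix of two stored patterns.
W = hebbian_weights([[1, -1, 1], [1, 1, -1]])
W3 = [[three_level(w, theta=1) for w in row] for row in W]
```

A larger threshold zeroes more weak interconnections; the abstract's point is that a properly chosen theta outperforms plain two-level clipping while remaining cheaper to implement than linear quantization.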

Original language: English
Title of host publication: 1993 IEEE International Conference on Neural Networks
Publisher: Publ by IEEE
Number of pages: 5
ISBN (Print): 0780312007
Publication status: Published - 1993
Event: 1993 IEEE International Conference on Neural Networks - San Francisco, California, USA
Duration: 1993 Mar 28 - 1993 Apr 1

Publication series

Name: 1993 IEEE International Conference on Neural Networks


Other: 1993 IEEE International Conference on Neural Networks
City: San Francisco, California, USA

All Science Journal Classification (ASJC) codes

  • Engineering (all)
  • Control and Systems Engineering
  • Software
  • Artificial Intelligence

