Linear quantization of Hebbian-type associative memories in interconnection implementations

Pau Choo Chung, Ching Tsorng Tsai, Yung Nien Sun

Research output: Contribution to conference › Paper › peer-review

3 Citations (Scopus)

Abstract

This paper explores the effects of linearly quantized Hebbian-type associative memories (HAMs) on storage capacity and hardware implementation. Under linear quantization, the interconnection weights are quantized into a small number of evenly spaced levels. The analysis focuses mainly on situations where only limited accuracy can be achieved in hardware implementations. Simulation and theoretical results show that the number of quantization levels required is small relative to the number of possible interconnection values, so linear quantization of HAMs is worthwhile in hardware implementations.
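As an illustration of the scheme described in the abstract, the sketch below builds a Hebbian-type associative memory from bipolar patterns, snaps each interconnection weight onto a small set of evenly spaced levels, and recalls a stored pattern from a noisy probe. This is a minimal sketch, not the authors' implementation; the pattern sizes, the choice of 8 levels, and the function names are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): Hebbian-type associative memory with
# linearly quantized interconnection weights. Sizes and level count are
# illustrative assumptions.
import numpy as np

def hebbian_weights(patterns):
    """Sum-of-outer-products Hebbian weight matrix for bipolar (+1/-1) patterns."""
    W = patterns.T @ patterns
    np.fill_diagonal(W, 0)  # no self-connections
    return W.astype(float)

def linear_quantize(W, levels):
    """Snap each weight to the nearest of `levels` evenly spaced values in [-w_max, w_max]."""
    w_max = np.abs(W).max()
    if w_max == 0:
        return W
    grid = np.linspace(-w_max, w_max, levels)          # evenly spaced quantization levels
    idx = np.abs(W[..., None] - grid).argmin(axis=-1)  # index of nearest level per weight
    return grid[idx]

def recall(W, probe, iterations=10):
    """Synchronous recall: repeatedly threshold the weighted sums."""
    state = probe.copy()
    for _ in range(iterations):
        state = np.where(W @ state >= 0, 1, -1)
    return state

# Example: store a few random bipolar patterns, quantize weights to 8 levels,
# then recall the first pattern from a noisy probe (about 25% of bits flipped).
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(5, 64))   # 5 patterns, 64 neurons
Wq = linear_quantize(hebbian_weights(patterns), levels=8)
noisy = patterns[0] * rng.choice([1, 1, 1, -1], size=64)
print("overlap with stored pattern:", np.mean(recall(Wq, noisy) == patterns[0]))
```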

Original language: English
Pages: 1092-1096
Number of pages: 5
Publication status: Published - 1994 Dec 1
Event: Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7) - Orlando, FL, USA
Duration: 1994 Jun 27 - 1994 Jun 29


All Science Journal Classification (ASJC) codes

  • Software

