Design and implementation of a self-adaptive fuzzy inference engine

Jer-Min Jou, Pei-Yin Chen, Sheng Fu Yang

Research output: Contribution to conference › Paper › peer-review

2 Citations (Scopus)

Abstract

A pipelined fuzzy inference chip with a self-tunable knowledge base is presented in this paper. Up to 49 rules are inferred in parallel in the chip, and the memory size of its knowledge (rule) base is only 84 bytes, since a memory-efficient, adjustable fuzzy rule format and dynamic rule-generating circuits are used. Based on these mechanisms and a rule weight tuner, the chip supports narrowing, widening, moving, amplifying, and/or dampening the membership functions, which makes the inference process self-adaptive. A three-stage pipeline in the parallel inference architecture makes the chip very fast: it yields an inference rate of 467K inferences/sec at a clock rate of 30 MHz.
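The abstract describes the adaptation mechanism only at a high level. The sketch below is a hypothetical software analogy, not the authors' circuit design: a triangular membership function with tunable center, width, and height, plus a rule weight, illustrating the kind of moving, narrowing/widening, amplifying/dampening, and weight-tuning adjustments mentioned. All names and parameters are illustrative assumptions. As a side note, 467K inferences/sec at 30 MHz works out to roughly 64 clock cycles per inference.

```c
#include <stdio.h>

/* Illustrative analogy only (not the chip's rule format or circuits):
 * a triangular membership function whose shape and weight can be tuned. */
typedef struct {
    double center;  /* peak position of the triangle                 */
    double width;   /* half-base width; smaller = narrower set       */
    double height;  /* peak value; <1 dampens, >1 amplifies          */
    double weight;  /* rule weight applied by a weight tuner         */
} fuzzy_set;

/* Degree of membership of x in the (scaled) triangular set. */
static double membership(const fuzzy_set *s, double x) {
    double d = x > s->center ? x - s->center : s->center - x;
    if (d >= s->width) return 0.0;
    return s->height * (1.0 - d / s->width);
}

/* Self-adaptive update: shift, rescale, and reweight a rule's set. */
static void adapt(fuzzy_set *s, double shift, double width_scale,
                  double height_scale, double new_weight) {
    s->center += shift;         /* moving                            */
    s->width  *= width_scale;   /* narrowing (<1) / widening (>1)    */
    s->height *= height_scale;  /* dampening (<1) / amplifying (>1)  */
    s->weight  = new_weight;    /* rule weight tuning                */
}

int main(void) {
    fuzzy_set s = { 0.5, 0.2, 1.0, 1.0 };
    printf("before: mu(0.55) = %.3f\n", membership(&s, 0.55));
    adapt(&s, 0.05, 0.5, 0.9, 0.8);  /* move right, narrow, dampen */
    printf("after:  mu(0.55) = %.3f\n", membership(&s, 0.55));
    return 0;
}
```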

Original language: English
Pages: 1633-1640
Number of pages: 8
Publication status: Published - 1995 Jan 1
Event: Proceedings of the 1995 IEEE International Conference on Fuzzy Systems. Part 1 (of 5) - Yokohama, Japan
Duration: 1995 Mar 20 - 1995 Mar 24


All Science Journal Classification (ASJC) codes

  • Software
  • Theoretical Computer Science
  • Artificial Intelligence
  • Applied Mathematics

