Design and implementation of a self-adaptive fuzzy inference engine

Jer-Min Jou, Pei-Yin Chen, Sheng Fu Yang

Research output: Paper, peer-reviewed

2 citations (Scopus)

Abstract

A pipelined fuzzy inference chip with a self-tunable knowledge base is presented in this paper. Up to 49 rules are inferred in parallel on the chip, and the memory size of its knowledge (rule) base is only 84 bytes, since a memory-efficient, adjustable fuzzy rule format and dynamic rule-generating circuits are used. Based on these mechanisms and a rule-weight tuner, the chip can narrow, widen, move, amplify, and/or dampen the membership functions, which makes the inference process self-adaptive. A three-stage pipeline in the parallel inference architecture makes the chip very fast: it achieves an inference rate of 467K inferences/sec at a clock rate of 30 MHz.
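
The kinds of membership-function adaptation named in the abstract (narrowing/widening, moving, amplifying/dampening) can be illustrated in software. The sketch below is a minimal, assumption-laden illustration only, not the chip's rule format or tuning circuits: the triangular shape, the TunableMF class, and the tune() parameters are hypothetical. As a scale check, the quoted throughput corresponds to roughly 64 clock cycles per inference (30 MHz / 467K inferences/sec).

```python
# Minimal software sketch of the membership-function adjustments described in
# the abstract. The triangular shape, parameter names, and update rule are
# illustrative assumptions; they do not reproduce the chip's hardware design.

from dataclasses import dataclass


@dataclass
class TunableMF:
    center: float   # peak position; "moving" shifts this
    width: float    # half-base width; "narrowing"/"widening" scales this
    weight: float   # rule weight; "amplifying"/"dampening" scales this

    def degree(self, x: float) -> float:
        """Weighted triangular membership degree of input x."""
        d = abs(x - self.center)
        if d >= self.width:
            return 0.0
        return self.weight * (1.0 - d / self.width)

    def tune(self, shift: float = 0.0, width_scale: float = 1.0,
             weight_scale: float = 1.0) -> None:
        """Apply the three kinds of adaptation named in the abstract."""
        self.center += shift
        self.width = max(1e-6, self.width * width_scale)
        self.weight = min(1.0, max(0.0, self.weight * weight_scale))


if __name__ == "__main__":
    mf = TunableMF(center=0.5, width=0.2, weight=1.0)
    print(mf.degree(0.55))                                   # before tuning: 0.75
    mf.tune(shift=0.05, width_scale=0.8, weight_scale=0.9)   # move, narrow, dampen
    print(mf.degree(0.55))                                   # after tuning: 0.9
```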

Original language: English
Pages: 1633-1640
Number of pages: 8
Publication status: Published - 1 Jan 1995
Event: Proceedings of the 1995 IEEE International Conference on Fuzzy Systems. Part 1 (of 5) - Yokohama, Japan
Duration: 20 Mar 1995 - 24 Mar 1995

Other

Other: Proceedings of the 1995 IEEE International Conference on Fuzzy Systems. Part 1 (of 5)
City: Yokohama, Japan
Period: 95-03-20 - 95-03-24

All Science Journal Classification (ASJC) codes

  • Software
  • Theoretical Computer Science
  • Artificial Intelligence
  • Applied Mathematics
