Emulation of neural networks on a nanoscale architecture

Mary M. Eshaghian-Wilner, Aaron Friesz, Alex Khitun, Shiva Navab, Alice C. Parker, Kang L. Wang, Chongwu Zhou

Research output: Article › peer-review

10 Citations (Scopus)

Abstract

In this paper, we propose a nanoscale spin-wave-based architecture for implementing neural networks. We show that this architecture can efficiently realize highly interconnected neural network models such as the Hopfield model. In our proposed architecture, no point-to-point interconnection is required, so unlike standard VLSI design, no fan-in/fan-out constraint limits the interconnectivity. Using spin waves, each neuron can broadcast to all other neurons simultaneously, and likewise each neuron can concurrently receive and process multiple incoming signals. Therefore, in this architecture, the total weighted sum at each neuron is computed as the sum of the values carried by all of that neuron's incoming waves. In addition, owing to the superposition property of waves, this computation can be done in O(1) time, so neurons can update their states quite rapidly.
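For reference, the computation the abstract describes corresponds to the standard synchronous Hopfield update, in which each neuron thresholds the total weighted sum of all incoming contributions. The following NumPy sketch (all names illustrative, not from the paper) shows that update in software; the proposed spin-wave architecture would instead form the same sums physically, through superposition of concurrently broadcast waves.

```python
import numpy as np

# Illustrative sketch only: a conventional synchronous Hopfield update.
# The paper's spin-wave architecture computes the same weighted sums
# physically via wave superposition, not by matrix arithmetic.

def train_hebbian(patterns):
    """Hebbian weight matrix for a set of +/-1 patterns."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def update(W, state):
    """Each neuron sums all incoming weighted contributions (the analogue
    of the superposed incoming waves) and thresholds the result."""
    return np.where(W @ state >= 0, 1, -1)

# Two orthogonal stored patterns over 8 neurons.
patterns = np.array([[1,  1, 1,  1, -1, -1, -1, -1],
                     [1, -1, 1, -1,  1, -1,  1, -1]])
W = train_hebbian(patterns)

noisy = patterns[0].copy()
noisy[0] = -noisy[0]          # corrupt one bit of the first pattern
recalled = update(W, noisy)   # one synchronous update restores it
print(recalled)               # → [ 1  1  1  1 -1 -1 -1 -1]
```

In the spin-wave realization, the matrix-vector product above is replaced by each neuron reading off the superposed amplitude of all incoming waves, which is why the weighted sum takes O(1) time regardless of the number of connections.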

Original language: English
Article number: 058
Pages (from - to): 288-292
Number of pages: 5
Journal: Journal of Physics: Conference Series
Volume: 61
Issue number: 1
DOIs
Publication status: Published - 1 Apr 2007

All Science Journal Classification (ASJC) codes

  • Physics and Astronomy (all)
