Defect tolerant implementations of feed-forward and recurrent neural networks

Paul Franzon, David Van den Bout, John Paulos, Thomas Miller, Wesley Snyder, Troy Nagle, Wentai Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The issues involved in the defect-tolerant, large-scale implementation of two basic architectural classes of artificial neural networks, feed-forward and recurrent, are considered. The structures can be reconfigured so that neural networks of different sizes and connectivities can be implemented on the same piece of hardware. With the addition of suitable test measures, the techniques that make these neural networks scalable and flexible also provide defect tolerance, so the advantages of wafer-scale integration (WSI) can be readily applied to these structures. Applying WSI techniques enhances scalability, since the requirement to partition the network between chips can be greatly relaxed; richly connected networks can therefore be constructed.

Original language: English
Title of host publication: 1990 Proc Int Conf Wafer Scale Integr
Editors: Joe Brewer, Michael J. Little
Publisher: IEEE
Pages: 160-166
Number of pages: 7
ISBN (Print): 0818690135
Publication status: Published - 1990
Event: 1990 International Conference on Wafer Scale Integration - San Francisco, CA, USA
Duration: 1990 Jan 23 - 1990 Jan 25

Publication series

Name: 1990 Proc Int Conf Wafer Scale Integr

Conference

Conference: 1990 International Conference on Wafer Scale Integration
City: San Francisco, CA, USA
Period: 1990-01-23 - 1990-01-25

All Science Journal Classification (ASJC) codes

  • General Engineering

