Low-complexity neuron for fixed-point artificial neural networks with ReLU activation function in energy-constrained wireless applications

Wen Long Chin, Qinyu Zhang, Tao Jiang

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

This work introduces an efficient neuron design for fixed-point artificial neural networks (ANNs) with the rectified linear unit (ReLU) activation function, targeting energy-constrained wireless applications. Fixed-point binary arithmetic and the ReLU activation function are used in most application-specific integrated circuit designs and ANNs, respectively. Because ANNs involve computation-intensive tasks, their computational burden is very heavy, and many researchers and practitioners are therefore seeking ways to reduce the implementation complexity of ANNs, particularly for battery-powered wireless applications. To this end, a low-complexity neuron is proposed that predicts the sign bit of the input to the non-linear ReLU activation function by exploiting the saturation characteristic of the activation function. According to simulation results on random data, the proposed technique reduces the computation overhead of a neuron by 29.6% compared with a conventional neuron using a word length of 8 bits, without noticeably increasing the prediction error. A comparison with the popular 16-bit fixed-point format of the convolutional network AlexNet indicates a computation saving of 48.58% as well.
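The abstract does not give implementation details, but one plausible reading of the sign-bit prediction idea is sketched below in Python: ReLU saturates to zero for negative pre-activations, so a cheap low-precision pass over the high-order bits of the fixed-point products can predict the sign and, when it is negative, skip the full multiply-accumulate. The function names, bit widths, and truncation rule here are illustrative assumptions, not the authors' exact design.

# Minimal sketch of sign-bit prediction for a fixed-point ReLU neuron.
# Assumptions (not from the paper): 4 fractional bits, a cheap pass that keeps
# only the top bits of each product, and a full-precision fallback otherwise.

def quantize(x, frac_bits=4):
    """Round a real value to a signed fixed-point integer with `frac_bits` fractional bits."""
    return int(round(x * (1 << frac_bits)))

def relu_neuron_with_sign_prediction(weights, inputs, frac_bits=4, keep_bits=4):
    """ReLU neuron that first estimates the pre-activation sign from truncated products."""
    drop = 2 * frac_bits - keep_bits          # low-order bits discarded in the cheap pass
    w_q = [quantize(w, frac_bits) for w in weights]
    x_q = [quantize(x, frac_bits) for x in inputs]

    # Cheap pass: accumulate truncated products to predict the sign bit.
    approx = sum((w * x) >> drop for w, x in zip(w_q, x_q))
    if approx < 0:
        # Predicted negative pre-activation: ReLU saturates to 0, so the
        # full-precision accumulation is skipped (this is the saved computation).
        return 0.0

    # Otherwise complete the full fixed-point computation; mispredictions in this
    # branch are still corrected by the final max(., 0).
    acc = sum(w * x for w, x in zip(w_q, x_q))
    return max(acc, 0) / float(1 << (2 * frac_bits))

# Example: the truncated pass predicts a negative sum, so the neuron returns 0
# without the full multiply-accumulate.
print(relu_neuron_with_sign_prediction([0.5, -0.75, 0.25], [0.25, 0.5, -1.0]))

The trade-off illustrated by `keep_bits` is the one the abstract quantifies: keeping fewer bits in the cheap pass saves more computation but raises the chance of a wrong sign prediction.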

Original language: English
Pages (from-to): 917-923
Number of pages: 7
Journal: IET Communications
Volume: 15
Issue number: 7
Publication status: Published - Apr 2021

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Electrical and Electronic Engineering
