TY - JOUR
T1 - Low-complexity neuron for fixed-point artificial neural networks with ReLU activation function in energy-constrained wireless applications
AU - Chin, Wen Long
AU - Zhang, Qinyu
AU - Jiang, Tao
N1 - Funding Information:
The authors would like to thank the editor and reviewers for their helpful comments in improving the quality of this paper. This work was supported in part by grant MOST 109-2221-E-006-181, Taiwan.
Publisher Copyright:
© 2021 The Authors. IET Communications published by John Wiley & Sons Ltd on behalf of The Institution of Engineering and Technology
PY - 2021/4
Y1 - 2021/4
N2 - This work introduces an efficient neuron design for fixed-point artificial neural networks (ANNs) with the rectified linear unit (ReLU) activation function, targeting energy-constrained wireless applications. Fixed-point binary numbers and the ReLU activation function are used in most application-specific integrated circuit designs and ANNs, respectively. Because ANNs involve computation-intensive tasks, their computational burden is extremely heavy, and many practitioners and researchers are therefore seeking ways to reduce the implementation complexity of ANNs, particularly for battery-powered wireless applications. To this end, a low-complexity neuron is proposed that predicts the sign bit of the input to the non-linear ReLU activation function by exploiting the saturation characteristic of the activation function. According to simulation results on random data, the proposed technique reduces the computation overhead of a neuron by 29.6% compared with a conventional neuron using a word length of 8 bits, without appreciably increasing the prediction error. A comparison of the proposed algorithm with the popular 16-bit fixed-point format of the convolutional network AlexNet shows that the computation can be reduced by 48.58%.
AB - This work introduces an efficient neuron design for fixed-point artificial neural networks (ANNs) with the rectified linear unit (ReLU) activation function, targeting energy-constrained wireless applications. Fixed-point binary numbers and the ReLU activation function are used in most application-specific integrated circuit designs and ANNs, respectively. Because ANNs involve computation-intensive tasks, their computational burden is extremely heavy, and many practitioners and researchers are therefore seeking ways to reduce the implementation complexity of ANNs, particularly for battery-powered wireless applications. To this end, a low-complexity neuron is proposed that predicts the sign bit of the input to the non-linear ReLU activation function by exploiting the saturation characteristic of the activation function. According to simulation results on random data, the proposed technique reduces the computation overhead of a neuron by 29.6% compared with a conventional neuron using a word length of 8 bits, without appreciably increasing the prediction error. A comparison of the proposed algorithm with the popular 16-bit fixed-point format of the convolutional network AlexNet shows that the computation can be reduced by 48.58%.
UR - http://www.scopus.com/inward/record.url?scp=85101856418&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85101856418&partnerID=8YFLogxK
U2 - 10.1049/cmu2.12129
DO - 10.1049/cmu2.12129
M3 - Article
AN - SCOPUS:85101856418
SN - 1751-8628
VL - 15
SP - 917
EP - 923
JO - IET Communications
JF - IET Communications
IS - 7
ER -
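
The abstract describes the method only at a high level: the neuron predicts the sign bit of the ReLU input and, because ReLU maps every negative input to zero, the remaining arithmetic can be skipped whenever the sign is predicted negative. The Python sketch below illustrates that general idea; the function name, the word_len/msb_bits parameters, and the MSB-partial-sum prediction rule are illustrative assumptions, not the algorithm from the paper.

    def relu_neuron_with_sign_prediction(x, w, bias, word_len=8, msb_bits=4):
        """x, w: signed integers encoding fixed-point inputs and weights.
        word_len and msb_bits are illustrative parameters, not from the paper."""
        shift = word_len - msb_bits  # low-order input bits ignored by the predictor

        # Coarse pre-activation estimate using only the high-order bits of x.
        coarse = sum((xi >> shift) * wi for xi, wi in zip(x, w)) << shift
        coarse += bias

        # Conservative bound on the positive contribution the discarded
        # low-order bits could still add: each residual is < 2**shift.
        max_low_order = sum(abs(wi) for wi in w) * ((1 << shift) - 1)

        if coarse + max_low_order < 0:
            # Sign bit predicted negative: ReLU clips to 0, so the exact
            # multiply-accumulate (the expensive part) can be skipped.
            return 0

        # Otherwise fall back to the full-precision neuron.
        acc = sum(xi * wi for xi, wi in zip(x, w)) + bias
        return max(acc, 0)  # ReLU activation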