RAM: Exploiting Restrained and Approximate Management for Enabling Neural Network Training on NVM-based Systems

Chien Chung Ho, Wei Chen Wang, Szu Yu Chen, Yung Chun Li, Kun Chi Chiang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

Training neural networks (NN) on conventional DRAM-based edge devices is highly restricted by DRAM's leakage power and limited density. Non-volatile memory (NVM) shows its potential to provide solutions with high density and nearly zero leakage power; however, it can lead to severe performance and lifetime issues. This paper focuses on exploring NVM-aware neural network training designs to mitigate the performance and lifetime issues caused by conventional NN approaches. In particular, we aim to exploit restrained and approximate management to leverage insignificant data for performance and lifetime optimization. Encouraging results were observed in the evaluations, with comparable validation accuracy.
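To illustrate the general idea behind the abstract, here is a minimal, hypothetical sketch of what "restrained management" of insignificant data could look like: an SGD-style update that only commits weight changes above a threshold to NVM, skipping tiny ("insignificant") writes to reduce write traffic and wear. The function name, threshold `eps`, and the whole scheme are assumptions for illustration, not the paper's actual algorithm.

```python
# Hypothetical sketch (NOT the paper's algorithm): threshold-based write
# filtering for NVM-backed weights -- skip committing insignificant updates
# to cut NVM write traffic and wear, trading a small amount of accuracy.

def restrained_update(weights, gradients, lr=0.01, eps=1e-3):
    """SGD step that only writes updates whose magnitude exceeds eps.

    Returns the new weight list and the number of NVM writes performed.
    """
    writes = 0
    out = list(weights)
    for i, g in enumerate(gradients):
        delta = lr * g
        if abs(delta) >= eps:       # significant: commit the write to NVM
            out[i] -= delta
            writes += 1
        # else: restrained -- keep the stale value, saving one NVM write
    return out, writes

w = [1.0] * 8
g = [0.5, 0.001, 0.3, 0.0005, 0.2, 0.0001, 0.4, 0.002]
w, nvm_writes = restrained_update(w, g)
# Only the 4 significant updates reach NVM; the other 4 writes are skipped.
```

In an actual NVM system the skipped writes would directly translate into fewer cell programs, improving both latency and endurance; the threshold controls the accuracy/lifetime trade-off.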

Original language: English
Title of host publication: Proceedings of the 37th ACM/SIGAPP Symposium on Applied Computing, SAC 2022
Publisher: Association for Computing Machinery
Pages: 116-123
Number of pages: 8
ISBN (Electronic): 9781450387132
DOIs
Publication status: Published - 2022 Apr 25
Event: 37th ACM/SIGAPP Symposium on Applied Computing, SAC 2022 - Virtual, Online
Duration: 2022 Apr 25 - 2022 Apr 29

Publication series

Name: Proceedings of the ACM Symposium on Applied Computing

Conference

Conference: 37th ACM/SIGAPP Symposium on Applied Computing, SAC 2022
City: Virtual, Online
Period: 22-04-25 - 22-04-29

All Science Journal Classification (ASJC) codes

  • Software

