Micro Darknet for Inference: ESL reference for inference accelerator design

Min Zhi Ji, Wei Chung Tseng, Ting Jia Wu, Bo Rong Lin, Chung Ho Chen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

For neural network (NN) models deployed on low-end edge devices, memory management is a critical issue because of limited hardware resources. However, current NN frameworks typically allocate a huge memory space for NN models at initialization. To reduce memory requirements, we propose MDFI (Micro Darknet for Inference), a lightweight inference-only NN framework based on Darknet. We optimize the MDFI C code with layer-wise memory management and a layer-dependency resolving mechanism. According to the experimental results, MDFI's average memory consumption is 76% lower than Darknet's, and its average execution time is also 8% lower.

Original language: English
Title of host publication: Proceedings - 2019 International SoC Design Conference, ISOCC 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 69-70
Number of pages: 2
ISBN (Electronic): 9781728124780
DOIs
Publication status: Published - 2019 Oct
Event: 16th International System-on-Chip Design Conference, ISOCC 2019 - Jeju, Korea, Republic of
Duration: 2019 Oct 6 - 2019 Oct 9

Publication series

Name: Proceedings - 2019 International SoC Design Conference, ISOCC 2019

Conference

Conference: 16th International System-on-Chip Design Conference, ISOCC 2019
Country: Korea, Republic of
City: Jeju
Period: 19-10-06 - 19-10-09

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Electrical and Electronic Engineering
  • Instrumentation
  • Artificial Intelligence
  • Hardware and Architecture

