Micro Darknet for Inference: ESL reference for inference accelerator design

Min Zhi Ji, Wei Chung Tseng, Ting Jia Wu, Bo Rong Lin, Chung Ho Chen

Research output: Conference contribution

1 citation (Scopus)

Abstract

For neural network (NN) models deployed on low-end edge devices, memory management is a critical issue because of limited hardware resources. However, current NN frameworks typically allocate a huge memory space for NN models at the initialization stage. To reduce memory requirements, we propose MDFI (Micro Darknet for Inference), a lightweight inference-only NN framework based on Darknet. We optimize the MDFI C code with layer-wise memory management and a layer-dependency resolving mechanism. According to the experimental results, the average memory consumption of MDFI is reduced by 76% compared to Darknet, and the average execution time is also reduced by 8%.
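The abstract does not include code, but the layer-wise idea can be illustrated with a minimal C sketch: instead of allocating every layer's buffer up front, each output buffer is allocated just before its layer runs and freed as soon as all dependent layers have consumed it. This is not the authors' MDFI implementation; the layer sizes, names, and consumer-count scheme below are illustrative assumptions only.

/* Minimal sketch of layer-wise memory management with dependency-based
 * release (illustrative; not the MDFI source code). */
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    float *output;       /* activation buffer, allocated just before use */
    size_t out_elems;    /* number of floats in the output buffer        */
    int consumers_left;  /* dependent layers that still need this output */
} layer_t;

/* Allocate a layer's output right before the layer executes. */
static void layer_alloc(layer_t *l, size_t out_elems, int consumers)
{
    l->out_elems = out_elems;
    l->consumers_left = consumers;
    l->output = calloc(out_elems, sizeof(float));
}

/* Called by each consumer once it has read the producer's output.
 * When the last consumer finishes, the buffer is freed immediately,
 * so peak memory stays close to the live working set of the network. */
static void layer_release(layer_t *producer)
{
    if (--producer->consumers_left == 0) {
        free(producer->output);
        producer->output = NULL;
    }
}

int main(void)
{
    /* Toy 3-layer chain: conv1 -> conv2 -> fc (sizes are made up). */
    layer_t conv1, conv2, fc;

    layer_alloc(&conv1, 64 * 32 * 32, /*consumers=*/1);
    /* ... run conv1, writing into conv1.output ... */

    layer_alloc(&conv2, 128 * 16 * 16, /*consumers=*/1);
    /* ... run conv2, reading conv1.output ... */
    layer_release(&conv1);  /* conv1 output is no longer needed */

    layer_alloc(&fc, 1000, /*consumers=*/1);
    /* ... run fc, reading conv2.output ... */
    layer_release(&conv2);

    printf("final buffer holds %zu floats\n", fc.out_elems);
    free(fc.output);
    return 0;
}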

Original language: English
Title of host publication: Proceedings - 2019 International SoC Design Conference, ISOCC 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 69-70
Number of pages: 2
ISBN (electronic): 9781728124780
DOIs
Publication status: Published - Oct 2019
Event: 16th International System-on-Chip Design Conference, ISOCC 2019 - Jeju, Korea, Republic of
Duration: 6 Oct 2019 - 9 Oct 2019

Publication series

Name: Proceedings - 2019 International SoC Design Conference, ISOCC 2019

Conference

Conference: 16th International System-on-Chip Design Conference, ISOCC 2019
Country/Territory: Korea, Republic of
City: Jeju
Period: 6 Oct 2019 - 9 Oct 2019

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Electrical and Electronic Engineering
  • Instrumentation
  • Artificial Intelligence
  • Hardware and Architecture
