For neural network (NN) models deployed on low-end edge devices, memory management is a critical issue because of limited hardware resources. However, current NN frameworks typically allocate a large block of memory for the entire model at initialization. To reduce memory requirements, we propose MDFI (Micro Darknet for Inference), a lightweight inference-only NN framework based on Darknet. We optimize the MDFI C code with layer-wise memory management and a layer-dependency resolving mechanism. According to the experimental results, MDFI reduces average memory consumption by 76% compared to Darknet, and also reduces average execution time by 8%.
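
The abstract only names the mechanism, so the following is a minimal C sketch of what layer-wise memory management with dependency resolving could look like: each layer's output buffer is allocated just before the layer runs and freed as soon as the last layer that consumes it has finished. The structure and field names (layer_t, last_consumer, run_layer) are hypothetical illustrations, not taken from the MDFI or Darknet source.

#include <stdlib.h>
#include <stdio.h>

typedef struct {
    int input;          /* index of the layer whose output this layer reads (-1 = network input) */
    int last_consumer;  /* last layer index that still needs this layer's output */
    size_t output_size; /* bytes required for this layer's output buffer */
    float *output;      /* allocated only while some later layer depends on it */
} layer_t;

/* Placeholder for the actual layer computation (convolution, pooling, ...). */
static void run_layer(layer_t *l, const float *in) {
    (void)in;
    for (size_t i = 0; i < l->output_size / sizeof(float); ++i)
        l->output[i] = 0.0f;
}

static void forward(layer_t *layers, int n) {
    for (int i = 0; i < n; ++i) {
        /* Allocate only the current layer's output instead of the whole model. */
        layers[i].output = malloc(layers[i].output_size);
        if (!layers[i].output) { perror("malloc"); exit(EXIT_FAILURE); }

        const float *in = (layers[i].input >= 0)
                              ? layers[layers[i].input].output
                              : NULL;
        run_layer(&layers[i], in);

        /* Dependency resolving: release every earlier output whose last
         * consumer has now finished running. */
        for (int j = 0; j < i; ++j) {
            if (layers[j].output && layers[j].last_consumer <= i) {
                free(layers[j].output);
                layers[j].output = NULL;
            }
        }
    }
}

int main(void) {
    /* Toy 3-layer chain: layer k reads layer k-1 and is read by layer k+1. */
    layer_t net[3] = {
        { -1, 1, 64 * sizeof(float), NULL },
        {  0, 2, 32 * sizeof(float), NULL },
        {  1, 3, 16 * sizeof(float), NULL },
    };
    forward(net, 3);
    free(net[2].output); /* final output, freed by the caller */
    return 0;
}

Under this scheme the peak allocation is bounded by the largest pair of live layer buffers rather than the sum of all layer buffers, which is consistent with the memory reduction the abstract reports, although the actual MDFI implementation details may differ.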