Non-volatile memories (NVMs) have recently been shown to mitigate the issues of neural network training on DRAM-based systems, thanks to their near-zero leakage power and high scalability. However, NVMs introduce new challenges in energy consumption, lifetime, and performance caused by the massive weight/bias updates performed during the training phase. To tackle these issues, this work proposes an approximate write-once memory (WOM) coding method that exploits the characteristics of weight updates and the error tolerance of neural networks (NNs). In particular, the proposed method aims to effectively reduce the number of writes to NVMs. Experimental results demonstrate that substantial improvements in energy consumption, endurance, and write performance can be achieved simultaneously without sacrificing inference accuracy.
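For background, the idea of a WOM code is to encode data so that successive values can be stored by only setting bits (never resetting them), which avoids costly cell rewrites. The sketch below implements the classic two-write WOM code of Rivest and Shamir (two data bits written twice into three write-once cells); it is an illustration of the underlying coding principle, not the paper's approximate variant.

```python
# Classic two-write WOM code: two data bits are stored twice in three
# write-once cells, where cells may only flip 0 -> 1. Illustrative only;
# the paper's approximate method builds on this kind of coding.

# First-generation codewords (Hamming weight <= 1).
FIRST = {(0, 0): (0, 0, 0), (0, 1): (0, 0, 1),
         (1, 0): (0, 1, 0), (1, 1): (1, 0, 0)}
# Second-generation codewords: bitwise complements (weight >= 2).
SECOND = {v: tuple(1 - b for b in c) for v, c in FIRST.items()}

def decode(cells):
    """Recover the two data bits from the three cells."""
    table = FIRST if sum(cells) <= 1 else SECOND
    return next(v for v, c in table.items() if c == cells)

def write(cells, value):
    """Store a 2-bit value using only 0 -> 1 transitions (two writes max)."""
    if decode(cells) == value:        # value already stored: zero cell writes
        return cells
    target = FIRST[value] if sum(cells) == 0 else SECOND[value]
    # Monotonicity check: every cell may only stay or be set, never reset.
    assert all(t >= c for t, c in zip(target, cells))
    return target
```

For example, writing `(0, 1)` into fresh cells yields `(0, 0, 1)`, and a subsequent write of `(1, 0)` yields `(1, 0, 1)` by setting a single additional cell, so both generations decode correctly without any bit reset.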