With their analog current accumulation capability, resistive memory (ReRAM) crossbars have been widely studied to accelerate neural network applications. ReRAM crossbar-based accelerators offer many advantages over conventional CMOS-based accelerators, such as high performance and energy efficiency. However, due to limited cell endurance, these accelerators suffer from short lifetimes when the weights stored in ReRAM cells are frequently updated during the neural network training phase. In this paper, by exploiting the wear-out mechanism of ReRAM cells, we propose ReNEW, a novel comprehensive framework to enhance the lifetime of ReRAM crossbar-based accelerators, particularly for neural network training. Evaluation results show that our proposed schemes reduce the total effective writes to ReRAM crossbar-based accelerators by up to 500.3×, 50.0×, 2.83×, and 1.60× over two MLC ReRAM crossbar baselines, one SLC ReRAM crossbar baseline, and an SLC ReRAM crossbar design with optimal timing, respectively.