AF-RCNN: An anchor-free convolutional neural network for multi-categories agricultural pest detection

L Jiao, S Dong, S Zhang, C Xie, H Wang - Computers and Electronics in Agriculture, 2020 - Elsevier
Abstract
Frequent outbreaks of agricultural pests reduce crop yields and seriously restrict agricultural production. Moreover, the large number of pest species makes accurate identification difficult for agricultural workers. Traditional methods of agricultural pest detection cannot satisfy the needs of agricultural production because of their low efficiency and accuracy. In this paper, we put forward an anchor-free region convolutional neural network (AF-RCNN) for precise recognition and classification of 24 classes of pests. First, a feature fusion module is designed to extract effective feature information of agricultural pests, especially small pests. Then, we propose an anchor-free region proposal network (AFRPN) that generates high-quality object proposals as possible pest positions based on the fused feature maps. Finally, the AF-RCNN detects the 24 pest classes in an end-to-end manner by merging the AFRPN with Fast R-CNN into a single network. We evaluate the performance of our method on a pest dataset of 20k images covering 24 classes. Experimental results demonstrate that our method obtains 56.4% mAP and 85.1% mRecall on the 24-class pest dataset, 7.5% and 15.3% higher than Faster R-CNN, and 39.4% and 56.5% higher than the YOLO detector, respectively. The running time is 0.07 s per image, meeting real-time detection requirements. The proposed method is effective and applicable for accurate, real-time intelligent pest detection.
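The abstract names two architectural ingredients: a feature fusion module (to preserve information about small pests) and an anchor-free proposal head that replaces the anchor boxes of a standard RPN. The sketch below, in PyTorch, is a minimal illustration of how such components are commonly built; the class names, channel sizes, fusion-by-addition strategy, and the per-location (left, top, right, bottom) box encoding are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch, not the paper's code: a feature fusion block that merges
# a deep, coarse feature map with a shallower, higher-resolution one, followed
# by an anchor-free proposal head that predicts, per feature-map location, an
# objectness score and a box offset instead of regressing against anchor boxes.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeatureFusion(nn.Module):
    """Upsample the deep map and fuse it with the shallow map by addition."""

    def __init__(self, shallow_ch: int, deep_ch: int, out_ch: int = 256):
        super().__init__()
        self.lateral = nn.Conv2d(shallow_ch, out_ch, kernel_size=1)
        self.reduce = nn.Conv2d(deep_ch, out_ch, kernel_size=1)
        self.smooth = nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1)

    def forward(self, shallow: torch.Tensor, deep: torch.Tensor) -> torch.Tensor:
        # Bring the deep map to the shallow map's spatial size, then add.
        deep_up = F.interpolate(self.reduce(deep), size=shallow.shape[-2:],
                                mode="nearest")
        return self.smooth(self.lateral(shallow) + deep_up)


class AnchorFreeProposalHead(nn.Module):
    """Per-location objectness logit and (l, t, r, b) distances to box sides."""

    def __init__(self, in_ch: int = 256):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, in_ch, kernel_size=3, padding=1)
        self.obj = nn.Conv2d(in_ch, 1, kernel_size=1)   # objectness logit
        self.reg = nn.Conv2d(in_ch, 4, kernel_size=1)   # box-side distances

    def forward(self, x: torch.Tensor):
        x = F.relu(self.conv(x))
        return self.obj(x), F.relu(self.reg(x))  # keep distances non-negative


if __name__ == "__main__":
    fuse = FeatureFusion(shallow_ch=512, deep_ch=2048)
    head = AnchorFreeProposalHead()
    shallow = torch.randn(1, 512, 64, 64)   # e.g. a mid-level backbone map
    deep = torch.randn(1, 2048, 16, 16)     # e.g. the last backbone map
    obj, boxes = head(fuse(shallow, deep))
    print(obj.shape, boxes.shape)           # (1, 1, 64, 64) (1, 4, 64, 64)
```

In a full detector of the kind the abstract describes, the top-scoring per-location boxes would serve as region proposals that are passed, via RoI pooling, to a Fast R-CNN head for per-class classification and box refinement.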