Development of a Novel Classification Technique for Detection of Cowpea Leaves Using VGG16 Deep Convolutional Network
Abstract
This paper applies the CNN-VGG16 network architecture to distinguish cowpea plants from weeds in cowpea fields, with the aim of increasing crop productivity. VGG16 is a basic network of the Convolutional Neural Network family with quite good classification performance, and it is easy to implement and modify. Here, a pre-trained VGG16 model has been applied to a custom dataset of cowpea and weed leaves. Images of the leaves were taken at the standard research farm of ICAR-NBPGR, Pusa campus, New Delhi. The dataset includes 1230 images across two classes. The basic principle of the VGG16 network for automatic detection comprises feature extraction from the input image, the addition of convolutional layers, pooling layers, and fully connected layers, and a sigmoid classifier. Experimental results show that the model efficiently classifies the cowpea and weed images: the accuracy on the training dataset is 97.58%, and that on the test dataset is 90.08%. Such results give hope for agricultural reform.
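The pipeline described in the abstract (a pre-trained VGG16 feature extractor followed by fully connected layers and a sigmoid output for the two classes) can be sketched as follows. This is a minimal illustration assuming a TensorFlow/Keras environment; the dataset directory names, image size, dense-layer width, and training settings are assumptions for illustration, not the authors' exact configuration.

```python
# Minimal sketch of the described pipeline: a pre-trained VGG16
# convolutional base with a fully connected head and a sigmoid output
# for binary cowpea-vs-weed classification. Directory layout, image
# size, layer width, and training settings are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input

IMG_SIZE = (224, 224)   # VGG16's standard input resolution
BATCH_SIZE = 32

# Hypothetical dataset folders, each containing cowpea/ and weed/ sub-folders
train_ds = tf.keras.utils.image_dataset_from_directory(
    "cowpea_weed_dataset/train", image_size=IMG_SIZE,
    batch_size=BATCH_SIZE, label_mode="binary")
test_ds = tf.keras.utils.image_dataset_from_directory(
    "cowpea_weed_dataset/test", image_size=IMG_SIZE,
    batch_size=BATCH_SIZE, label_mode="binary")

# Pre-trained VGG16 base (ImageNet weights), frozen so that only the
# newly added classification head is trained.
base = VGG16(weights="imagenet", include_top=False,
             input_shape=IMG_SIZE + (3,))
base.trainable = False

inputs = layers.Input(shape=IMG_SIZE + (3,))
x = preprocess_input(inputs)                 # VGG16's expected preprocessing
x = base(x, training=False)                  # feature extraction
x = layers.Flatten()(x)
x = layers.Dense(256, activation="relu")(x)  # fully connected layer
outputs = layers.Dense(1, activation="sigmoid")(x)  # sigmoid classifier
model = models.Model(inputs, outputs)

model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=test_ds, epochs=10)
```

After training, the held-out accuracy can be checked with `model.evaluate(test_ds)`; freezing the convolutional base in this way is the standard transfer-learning setup for a small two-class dataset of this size.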