Accurate automatic segmentation of crops and weeds in camera images is essential in various agricultural technology fields, such as herbicide spraying by farming robots guided by crop and weed segmentation information. However, crop and weed images captured by a camera often suffer from motion blur caused by, for example, vibration or shaking of the camera mounted on a farming robot or movement of the crops and weeds themselves, which reduces segmentation accuracy. Robust crop and weed segmentation for motion-blurred images is therefore essential, yet previous segmentation studies did not consider motion-blurred inputs. To address this problem, this study proposes a new motion-blur image restoration method based on a wide receptive field attention network (WRA-Net) and investigates how it improves crop and weed segmentation accuracy on motion-blurred images. WRA-Net is built from a main block, the lite wide receptive field attention residual block, which consists of modified depthwise separable convolutional blocks, an attention gate, and a learnable skip connection. Experiments were conducted on three open databases: the BoniRob, crop/weed field image, and rice seedling and weed datasets. The proposed method achieved mean intersection over union scores of 0.7444, 0.7741, and 0.7149, respectively, outperforming state-of-the-art methods.
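As a rough illustration of the block structure named in the abstract, the following PyTorch-style sketch shows one possible reading of the lite wide receptive field attention residual block. The layer widths, kernel sizes, activation choices, and the exact gating and skip formulations are assumptions for illustration only, not the authors' published WRA-Net implementation.

# Illustrative sketch only: layer sizes, the 7x7 depthwise kernel, GELU, the
# sigmoid gate, and the scalar skip weight are assumptions, not the paper's code.
import torch
import torch.nn as nn


class DepthwiseSeparableConv(nn.Module):
    """Depthwise conv (large kernel for a wide receptive field) + pointwise conv."""
    def __init__(self, channels: int, kernel_size: int = 7):
        super().__init__()
        self.depthwise = nn.Conv2d(channels, channels, kernel_size,
                                   padding=kernel_size // 2, groups=channels)
        self.pointwise = nn.Conv2d(channels, channels, kernel_size=1)
        self.act = nn.GELU()

    def forward(self, x):
        return self.act(self.pointwise(self.depthwise(x)))


class LiteWideAttentionResidualBlock(nn.Module):
    """One possible reading of the 'lite wide receptive field attention residual
    block': modified depthwise separable conv blocks, an attention gate, and a
    learnable (scalar-weighted) skip connection."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = DepthwiseSeparableConv(channels)
        self.conv2 = DepthwiseSeparableConv(channels)
        # Attention gate: 1x1 conv + sigmoid yields a per-pixel, per-channel mask.
        self.gate = nn.Sequential(nn.Conv2d(channels, channels, kernel_size=1),
                                  nn.Sigmoid())
        # Learnable skip connection: a trainable scalar scales the identity path.
        self.skip_weight = nn.Parameter(torch.ones(1))

    def forward(self, x):
        feat = self.conv2(self.conv1(x))
        feat = feat * self.gate(feat)       # apply the attention gate
        return self.skip_weight * x + feat  # learnable residual connection


if __name__ == "__main__":
    block = LiteWideAttentionResidualBlock(channels=32)
    out = block(torch.randn(1, 32, 64, 64))
    print(out.shape)  # torch.Size([1, 32, 64, 64])

A full restoration network would presumably stack many such blocks inside an encoder-decoder and feed the restored image to a downstream segmentation model; those surrounding details are not specified in the abstract and are omitted here.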
Distributed under a Creative Commons Attribution License (CC BY 4.0).