Accurate segmentation and detection of rice seedlings are essential for precision agriculture and high-yield cultivation. However, current methods suffer from high computational complexity and poor robustness across rice varieties and planting densities. This article proposes 2 lightweight neural network architectures, LW-Segnet and LW-Unet, for high-precision rice seedling segmentation. The networks adopt an encoder–decoder structure that combines hybrid lightweight convolutions with spatial pyramid dilated convolutions, achieving accurate segmentation while reducing model parameters. Multispectral imagery acquired by an unmanned aerial vehicle (UAV), covering 3 rice varieties and different planting densities, was used to train and test the models. Experimental results demonstrate that the proposed LW-Segnet and LW-Unet models achieve higher F1-scores and intersection over union values for seedling detection and row segmentation across varieties, indicating improved segmentation accuracy. Furthermore, the models maintain stable performance across varieties and densities, showing strong robustness. In terms of efficiency, the networks require less graphics processing unit memory, lower complexity, and fewer parameters while delivering faster inference, reflecting higher computational efficiency. In particular, the fast inference speed of LW-Unet indicates potential for real-time applications. The study presents lightweight yet effective neural network architectures for agricultural tasks. By handling multiple rice varieties and densities with high accuracy, efficiency, and robustness, the models show promise for deployment on edge devices and UAVs to assist precision farming and crop management. The findings provide valuable insights into designing lightweight deep learning models to tackle complex agricultural problems.
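To make the architectural ideas in the abstract concrete, the following is a minimal, hypothetical PyTorch sketch of an encoder–decoder segmenter that pairs depthwise-separable ("lightweight") convolutions with a spatial pyramid of dilated convolutions. It is an illustration under assumed settings (4 multispectral input bands, 2 output classes, reduced block counts), not the authors' LW-Segnet or LW-Unet implementation; the names TinySegNet, DepthwiseSeparableConv, and SpatialPyramidDilation are invented for this example.

# Hypothetical sketch, not the paper's released code: a minimal encoder-decoder
# segmentation network with depthwise-separable convolutions and a dilated pyramid.
import torch
import torch.nn as nn


class DepthwiseSeparableConv(nn.Module):
    """Depthwise conv followed by pointwise conv: fewer parameters than a standard 3x3 conv."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch, bias=False),  # depthwise
            nn.Conv2d(in_ch, out_ch, 1, bias=False),                          # pointwise
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)


class SpatialPyramidDilation(nn.Module):
    """Parallel dilated convolutions enlarge the receptive field without extra downsampling."""
    def __init__(self, in_ch, out_ch, rates=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, 3, padding=r, dilation=r, bias=False) for r in rates
        )
        self.fuse = nn.Conv2d(out_ch * len(rates), out_ch, 1)

    def forward(self, x):
        return self.fuse(torch.cat([b(x) for b in self.branches], dim=1))


class TinySegNet(nn.Module):
    """Illustrative encoder-decoder for seedling/background segmentation (assumed 4-band input)."""
    def __init__(self, in_ch=4, num_classes=2):
        super().__init__()
        self.enc1 = DepthwiseSeparableConv(in_ch, 32)
        self.enc2 = DepthwiseSeparableConv(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bridge = SpatialPyramidDilation(64, 64)
        self.up = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = DepthwiseSeparableConv(64, 32)  # 64 = upsampled 32 + skip connection 32
        self.head = nn.Conv2d(32, num_classes, 1)

    def forward(self, x):
        e1 = self.enc1(x)                  # full-resolution features
        e2 = self.enc2(self.pool(e1))      # half-resolution features
        b = self.bridge(e2)                # multi-scale context via dilated pyramid
        d1 = self.dec1(torch.cat([self.up(b), e1], dim=1))  # decode with skip connection
        return self.head(d1)               # per-pixel class logits


if __name__ == "__main__":
    model = TinySegNet()
    logits = model(torch.randn(1, 4, 128, 128))  # one 4-band 128x128 tile
    print(logits.shape)  # torch.Size([1, 2, 128, 128])

The design trade-off this sketch illustrates is the one the abstract relies on: depthwise-separable convolutions reduce parameter counts roughly in proportion to the kernel area, while the dilated pyramid widens the receptive field without additional pooling, which is how such lightweight networks keep accuracy while lowering memory use and inference cost.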
Distributed under a Creative Commons Attribution License 4.0 (CC BY 4.0).