Research Article | Open Access

Rice Plant Counting, Locating, and Sizing Method Based on High-Throughput UAV RGB Images

Xiaodong Bai1, Pichao Liu2 (✉), Zhiguo Cao3, Hao Lu3, Haipeng Xiong4, Aiping Yang5, Zhe Cai5, Jianjun Wang5, Jianguo Yao2

1 School of Computer Science and Technology, Hainan University, Haikou 570228, China
2 School of Telecommunication and Information Engineering, Nanjing University of Posts and Telecommunications, Nanjing 210003, China
3 School of Artificial Intelligence and Automation, Huazhong University of Science and Technology, Wuhan 430074, China
4 School of Computing, National University of Singapore, Singapore 119077, Singapore
5 Agricultural Meteorological Center, Jiangxi Meteorological Bureau, Nanchang 330045, China

Abstract

Rice plant counting is crucial for many applications in rice production, such as yield estimation, growth diagnosis, and disaster loss assessment. Currently, rice counting still relies heavily on tedious and time-consuming manual operation. To alleviate this workload, we employed a UAV (unmanned aerial vehicle) to collect RGB images of paddy fields. We then propose a new rice plant counting, locating, and sizing method (RiceNet), which consists of one feature extractor frontend and three feature decoder modules, namely a density map estimator, a plant location detector, and a plant size estimator. In RiceNet, a rice plant attention mechanism and a positive–negative loss are designed to improve the ability to distinguish plants from the background and the quality of the estimated density maps. To verify the validity of our method, we propose a new UAV-based rice counting dataset containing 355 images and 257,793 manually labeled points. Experimental results show that the mean absolute error and root mean square error of the proposed RiceNet are 8.6 and 11.2, respectively. Moreover, we validate the performance of our method on two other popular crop datasets. On these three datasets, our method significantly outperforms state-of-the-art methods. The results suggest that RiceNet can accurately and efficiently estimate the number of rice plants and replace the traditional manual method.
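For illustration, the following is a minimal sketch of the overall layout described above: one shared feature extractor frontend followed by three decoder heads (density map, plant location, plant size), with the predicted count taken as the sum of the density map. This is not the authors' released code; the PyTorch framing, layer sizes, and head designs are illustrative assumptions only.

# Minimal sketch (assumed PyTorch-style, not the authors' implementation) of a
# counting network with one shared frontend and three decoder heads, mirroring
# the RiceNet layout in the abstract. All layer sizes are illustrative.
import torch
import torch.nn as nn

class MultiHeadCounter(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared frontend: small convolutional feature extractor (placeholder
        # for the backbone typically used in plant/crowd counting networks).
        self.frontend = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Three decoder heads, one per task.
        self.density_head = nn.Conv2d(128, 1, 1)   # per-pixel plant density
        self.location_head = nn.Conv2d(128, 1, 1)  # plant-center confidence map
        self.size_head = nn.Conv2d(128, 2, 1)      # per-pixel plant width/height

    def forward(self, x):
        feat = self.frontend(x)
        density = torch.relu(self.density_head(feat))
        location = torch.sigmoid(self.location_head(feat))
        size = torch.relu(self.size_head(feat))
        # The predicted plant count is the integral (sum) of the density map.
        count = density.sum(dim=(1, 2, 3))
        return density, location, size, count

if __name__ == "__main__":
    model = MultiHeadCounter()
    images = torch.randn(2, 3, 256, 256)  # dummy UAV RGB patches
    density, location, size, count = model(images)
    print(count.shape)  # one scalar count per image

In this framing, counting accuracy would be reported with the mean absolute error and root mean square error between predicted and labeled counts, as in the abstract.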


Plant Phenomics
Article number: 0020
Cite this article:
Bai X, Liu P, Cao Z, et al. Rice Plant Counting, Locating, and Sizing Method Based on High-Throughput UAV RGB Images. Plant Phenomics, 2023, 5: 0020. https://doi.org/10.34133/plantphenomics.0020


Received: 26 September 2022
Accepted: 09 December 2022
Published: 30 January 2023
© 2023 Xiaodong Bai et al. Exclusive Licensee Nanjing Agricultural University. No claim to original U.S. Government Works.

Distributed under a Creative Commons Attribution License (CC BY 4.0).
