Research Article | Open Access

Small and Oriented Wheat Spike Detection at the Filling and Maturity Stages Based on WheatNet

Jianqing Zhao1,2, Yucheng Cai1,2, Suwan Wang1,2, Jiawei Yan1,2, Xiaolei Qiu1,2, Xia Yao1,2,3, Yongchao Tian1,4, Yan Zhu1,2, Weixing Cao1,2, Xiaohu Zhang1,2,4 ( )
1 National Engineering and Technology Center for Information Agriculture, Nanjing Agricultural University, Nanjing 210095, China
2 Key Laboratory for Crop System Analysis and Decision Making, Ministry of Agriculture and Rural Affairs, Nanjing 210095, China
3 Jiangsu Key Laboratory for Information Agriculture, Nanjing 210095, China
4 Jiangsu Collaborative Innovation Center for Modern Crop Production, Nanjing 210095, China

†These authors contributed equally to this work.


Abstract

Accurate wheat spike detection is crucial to wheat field phenotyping for precision farming. Advances in artificial intelligence have enabled deep learning models to improve the accuracy of wheat spike detection. However, wheat growth is a dynamic process characterized by substantial changes in the color of wheat spikes and the background. Existing models for wheat spike detection are typically designed for a specific growth stage, and their adaptability to other growth stages or field scenes is limited. Such models cannot detect wheat spikes accurately because of differences in the color, size, and morphological features of spikes between growth stages. This paper proposes WheatNet to detect small and oriented wheat spikes from the filling to the maturity stage. WheatNet constructs a Transform Network to reduce the effect of differences in the color features of spikes at the filling and maturity stages on detection accuracy. Moreover, a Detection Network is designed to improve wheat spike detection capability: a Circle Smooth Label is proposed to classify wheat spike angles in drone imagery, a new micro-scale detection layer is added to the network to extract the features of small spikes, and the localization loss is improved with Complete Intersection over Union to reduce the impact of the background. The results show that WheatNet achieves greater accuracy than classical detection methods, with an average precision of 90.1% for spike detection at the filling stage and 88.6% at the maturity stage. These results suggest that WheatNet is a promising tool for the detection of wheat spikes.
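The Circle Smooth Label mentioned in the abstract treats spike orientation as a classification problem over angle bins rather than a direct regression, smoothing the one-hot target so that angularly adjacent bins (including across the 0°/180° wrap-around) receive partial credit. A minimal sketch of this encoding, assuming 1° bins and a truncated Gaussian window (the paper's exact bin count and window function may differ):

```python
import numpy as np

def circular_smooth_label(angle_deg, num_bins=180, radius=6):
    """Encode an orientation angle as a circularly smoothed label vector.

    Instead of a one-hot target, bins near the true angle get a Gaussian
    weight, and the distance wraps around so bin 179 is adjacent to bin 0.
    """
    bins = np.arange(num_bins)
    center = int(round(angle_deg)) % num_bins
    # Circular distance from every bin to the target bin.
    d = np.minimum(np.abs(bins - center), num_bins - np.abs(bins - center))
    label = np.exp(-(d ** 2) / (2.0 * radius ** 2))
    label[d > radius] = 0.0  # truncate the window outside the smoothing radius
    return label
```

Because the distance is circular, an angle of 0° produces identical weights for bins 1 and 179, which is what lets the classifier tolerate small angle errors at the boundary without a large loss penalty.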


Plant Phenomics
Article number: 0109
Cite this article:
Zhao J, Cai Y, Wang S, et al. Small and Oriented Wheat Spike Detection at the Filling and Maturity Stages Based on WheatNet. Plant Phenomics, 2023, 5: 0109. https://doi.org/10.34133/plantphenomics.0109


Received: 20 March 2023
Accepted: 27 September 2023
Published: 30 October 2023
© 2023 Jianqing Zhao et al. Exclusive licensee Nanjing Agricultural University. No claim to original U.S. Government Works.

Distributed under a Creative Commons Attribution License 4.0 (CC BY 4.0).
