Research Article | Open Access

Efficient Noninvasive FHB Estimation using RGB Images from a Novel Multiyear, Multirater Dataset

Dominik Rößle1, Lukas Prey2, Ludwig Ramgraber3, Anja Hanemann3, Daniel Cremers4, Patrick Ole Noack2, Torsten Schön1
1 AImotion Bavaria, Technische Hochschule Ingolstadt, Ingolstadt, Germany
2 Hochschule Weihenstephan-Triesdorf, Weidenbach, Germany
3 Saatzucht Josef Breun GmbH & Co. KG, Herzogenaurach, Germany
4 Technical University of Munich, Munich, Germany

Abstract

Fusarium head blight (FHB) is one of the most prevalent wheat diseases, causing substantial yield losses and health risks. Efficient phenotyping of FHB is crucial for accelerating resistance breeding, but currently used methods are time-consuming and expensive. This article proposes a noninvasive classification model for FHB severity estimation using red–green–blue (RGB) images, without requiring extensive preprocessing. The model accepts images taken from consumer-grade, low-cost RGB cameras and classifies the FHB severity into 6 ordinal levels. In addition, we introduce a novel dataset consisting of around 3,000 images from 3 different years (2020, 2021, and 2022) and 2 FHB severity assessments per image from independent raters. We used a pretrained EfficientNet (size b0), redesigned as a regression model. The results demonstrate that the interrater reliability (Cohen’s kappa, κ) is substantially lower than the individual network-to-rater agreement, e.g., 0.68 and 0.76 for the data captured in 2020, respectively. The model shows a generalization effect when trained with data from multiple years and tested on data from an independent year. Thus, using the images from 2020 and 2021 for training and 2022 for testing, we improved the F1W score by 0.14, the accuracy by 0.11, κ by 0.12, and reduced the root mean squared error by 0.5 compared to the best network trained only on a single year’s data. The proposed lightweight model and methods could be deployed on mobile devices to automatically and objectively assess FHB severity with images from low-cost RGB cameras. The source code and the dataset are available at https://github.com/cvims/FHB_classification.
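To make the described setup concrete, the following is a minimal sketch, not the authors' implementation (their code is in the repository linked above), of how a pretrained EfficientNet-b0 can be repurposed as a regression model whose continuous output is mapped back to the 6 ordinal severity levels. It assumes PyTorch and torchvision; function names, input size, and the rounding-based discretization are illustrative assumptions.

# Minimal sketch (not the authors' implementation): an ImageNet-pretrained
# EfficientNet-b0 repurposed as a regression model for the 6 ordinal FHB
# severity levels, assuming PyTorch and torchvision are available.
import torch
import torch.nn as nn
from torchvision import models

NUM_LEVELS = 6  # ordinal severity levels 0..5

def build_fhb_regressor() -> nn.Module:
    # Load the pretrained backbone and swap the 1,000-class classifier
    # head for a single continuous output.
    weights = models.EfficientNet_B0_Weights.IMAGENET1K_V1
    model = models.efficientnet_b0(weights=weights)
    in_features = model.classifier[1].in_features  # 1280 for b0
    model.classifier[1] = nn.Linear(in_features, 1)
    return model

def predict_severity(model: nn.Module, images: torch.Tensor) -> torch.Tensor:
    # Discretize the continuous prediction back to the ordinal scale by
    # rounding and clamping (one plausible mapping, not necessarily the
    # authors' exact procedure).
    model.eval()
    with torch.no_grad():
        scores = model(images).squeeze(1)
    return scores.round().clamp(0, NUM_LEVELS - 1).long()

if __name__ == "__main__":
    model = build_fhb_regressor()
    batch = torch.randn(2, 3, 224, 224)  # two RGB crops at ImageNet input size
    print(predict_severity(model, batch))

Predicted levels could then be compared with rater labels using, for example, scikit-learn's cohen_kappa_score, weighted f1_score, and mean_squared_error to mirror the κ, F1W, and RMSE metrics reported above.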

Plant Phenomics
Article number: 0068
Cite this article:
Rößle D, Prey L, Ramgraber L, et al. Efficient Noninvasive FHB Estimation using RGB Images from a Novel Multiyear, Multirater Dataset. Plant Phenomics, 2023, 5: 0068. https://doi.org/10.34133/plantphenomics.0068


Received: 25 January 2023
Accepted: 19 June 2023
Published: 14 July 2023
© 2023 Dominik Rößle et al. Exclusive licensee Nanjing Agricultural University. No claim to original U.S. Government Works.

Distributed under a Creative Commons Attribution License 4.0 (CC BY 4.0).
