Research Article | Open Access

Standardizing and Centralizing Datasets for Efficient Training of Agricultural Deep Learning Models

Amogh Joshi 1,2,3, Dario Guevara 1,2,3, Mason Earles 1,2,3 (corresponding author)
1 Department of Viticulture and Enology, University of California, Davis, Davis, CA, USA
2 Department of Biological and Agricultural Engineering, University of California, Davis, Davis, CA, USA
3 AI Institute for Next-Generation Food Systems (AIFS), University of California, Davis, Davis, CA, USA

Abstract

In recent years, deep learning models have become the standard for agricultural computer vision. Such models are typically fine-tuned to agricultural tasks from model weights originally fit to more general, non-agricultural datasets. This lack of agriculture-specific pretraining potentially increases training time and resource use and decreases model performance, leading to an overall decrease in data efficiency. To overcome this limitation, we collect a wide range of existing public datasets for three distinct tasks, standardize them, and construct standard training and evaluation pipelines, providing us with a set of benchmarks and pretrained models. We then conduct a number of experiments using methods that are common in deep learning but unexplored in agriculture-specific applications. These experiments guide us in developing a number of approaches to improve data efficiency when training agricultural deep learning models, without large-scale modifications to existing pipelines. Our results demonstrate that even slight training modifications, such as using agricultural pretrained model weights or adopting specific spatial augmentations in data processing pipelines, can considerably boost model performance and shorten convergence time, saving training resources. Furthermore, we find that even models trained on low-quality annotations can perform comparably to their high-quality equivalents, suggesting that datasets with poor annotations can still be used for training, expanding the pool of currently available datasets. Our methods are broadly applicable throughout agricultural deep learning and present high potential for substantial data efficiency improvements.
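The abstract notes that adopting specific spatial augmentations can boost performance. As a minimal illustration of what a spatial augmentation involves for detection data, and not a reproduction of the paper's actual pipeline, the NumPy sketch below flips an image horizontally while keeping its bounding-box annotations consistent; the function name and the (x_min, y_min, x_max, y_max) box format are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def horizontal_flip(image, boxes):
    """Flip an image left-right and remap its bounding boxes.

    image: array of shape (H, W, ...) indexed as [row, column].
    boxes: list of (x_min, y_min, x_max, y_max) tuples in pixel units.
    """
    w = image.shape[1]
    flipped = image[:, ::-1]  # reverse the column axis
    # A box's horizontal extent [x_min, x_max] maps to [w - x_max, w - x_min];
    # vertical coordinates are unchanged by a horizontal flip.
    new_boxes = [(w - x_max, y_min, w - x_min, y_max)
                 for (x_min, y_min, x_max, y_max) in boxes]
    return flipped, new_boxes

# Toy example: a 4x6 single-channel "image" with one annotated box.
img = np.arange(24).reshape(4, 6)
aug_img, aug_boxes = horizontal_flip(img, [(1, 0, 3, 2)])
```

The key point the sketch makes is that spatial augmentations must transform the annotations together with the pixels; augmentation libraries used in practice apply the image and target transforms as a single operation for exactly this reason.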

Plant Phenomics
Article number: 0084
Cite this article:
Joshi A, Guevara D, Earles M. Standardizing and Centralizing Datasets for Efficient Training of Agricultural Deep Learning Models. Plant Phenomics, 2023, 5: 0084. https://doi.org/10.34133/plantphenomics.0084


Received: 23 January 2023
Accepted: 02 August 2023
Published: 06 September 2023
© 2023 Amogh Joshi et al. Exclusive licensee Nanjing Agricultural University. No claim to original U.S. Government Works.

Distributed under a Creative Commons Attribution License 4.0 (CC BY 4.0).
