Research Article | Open Access

Time-Series Field Phenotyping of Soybean Growth Analysis by Combining Multimodal Deep Learning and Dynamic Modeling

Hui Yu1,2, Lin Weng2, Songquan Wu3, Jingjing He2, Yilin Yuan3, Jun Wang2, Xiaogang Xu2, and Xianzhong Feng1,2 (corresponding author)
1 Key Laboratory of Soybean Molecular Design Breeding, State Key Laboratory of Black Soils Conservation and Utilization, Northeast Institute of Geography and Agroecology, Chinese Academy of Sciences, Changchun 130102, China
2 Zhejiang Lab, Hangzhou 310012, China
3 Yanbian University, Yanji 133002, China

Abstract

The rate of soybean canopy establishment largely determines photoperiodic sensitivity, subsequently influencing yield potential. However, assessing the rate of soybean canopy development in large-scale field breeding trials is both laborious and time-consuming. High-throughput phenotyping methods based on unmanned aerial vehicle (UAV) systems can be used to monitor and quantitatively describe the development of soybean canopies for different genotypes. In this study, high-resolution, time-series raw data from field soybean populations were collected using UAVs. The RGB (red, green, and blue) and infrared images were used as inputs to construct a multimodal image segmentation model, the RGB & Infrared Feature Fusion Segmentation Network (RIFSeg-Net). Subsequently, the Segment Anything Model was employed to extract complete individual leaves from the segmentation results obtained from RIFSeg-Net. The leaf aspect ratios enabled the accurate categorization of the soybean populations into 2 distinct varieties: an oval leaf type and a lanceolate leaf type. Finally, dynamic modeling was conducted to identify 5 phenotypic traits associated with the canopy development rate that differed substantially between the classified soybean varieties. The results showed that the developed multimodal image segmentation model RIFSeg-Net outperformed traditional deep learning image segmentation networks in extracting soybean canopy cover from UAV images (precision = 0.94, recall = 0.93, F1-score = 0.93). The proposed method has high practical value in the field of germplasm resource identification and could serve as a practical tool for further genotypic differentiation analysis and the selection of target genes.
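The segmentation quality reported above (precision, recall, F1-score) is a standard pixel-wise comparison between a predicted canopy mask and a ground-truth mask. The following is a minimal sketch of that computation, not code from the paper; the function name and the toy 4×4 masks are illustrative assumptions.

```python
import numpy as np

def segmentation_prf(pred: np.ndarray, truth: np.ndarray):
    """Pixel-wise precision, recall, and F1-score for binary canopy masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()    # canopy pixels correctly predicted
    fp = np.logical_and(pred, ~truth).sum()   # background predicted as canopy
    fn = np.logical_and(~pred, truth).sum()   # canopy pixels missed
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Toy 4x4 masks: 1 = canopy pixel, 0 = background.
truth = np.array([[1, 1, 0, 0],
                  [1, 1, 0, 0],
                  [0, 0, 0, 0],
                  [0, 0, 0, 0]])
pred = np.array([[1, 1, 0, 0],
                 [1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 0, 0, 0]])
p, r, f1 = segmentation_prf(pred, truth)
print(round(p, 2), round(r, 2), round(f1, 2))  # -> 0.75 0.75 0.75
```

In practice these counts would be accumulated over all test images rather than a single mask; the same three metrics apply unchanged.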


Plant Phenomics
Article number: 0158
Cite this article:
Yu H, Weng L, Wu S, et al. Time-Series Field Phenotyping of Soybean Growth Analysis by Combining Multimodal Deep Learning and Dynamic Modeling. Plant Phenomics, 2024, 6: 0158. https://doi.org/10.34133/plantphenomics.0158


Received: 22 October 2023
Accepted: 21 February 2024
Published: 20 March 2024
© 2024 Hui Yu et al. Exclusive licensee Nanjing Agricultural University. No claim to original U.S. Government Works.

Distributed under a Creative Commons Attribution License 4.0 (CC BY 4.0).
