Reliable and automated 3-dimensional (3D) plant shoot segmentation is a core prerequisite for extracting plant phenotypic traits at the organ level. Combining deep learning with point clouds offers an effective way to address this challenge. However, fully supervised deep learning methods require point-wise annotated datasets, which are extremely expensive and time-consuming to produce. In this work, we proposed a novel weakly supervised framework, Eff-3DPSeg, for 3D plant shoot segmentation. First, high-resolution point clouds of soybean were reconstructed using a low-cost photogrammetry system, and the Meshlab-based Plant Annotator was developed for plant point cloud annotation. Second, a weakly supervised deep learning method was proposed for plant organ segmentation. The method comprises (a) pretraining a self-supervised network with a Viewpoint Bottleneck loss to learn meaningful intrinsic structure representations from the raw point clouds and (b) fine-tuning the pretrained model, with only about 0.5% of the points annotated, to perform plant organ segmentation. Afterward, 3 phenotypic traits (stem diameter, leaf width, and leaf length) were extracted. To test the generality of the proposed method, the public dataset Pheno4D was included in this study. Experimental results showed that the weakly supervised network achieved segmentation performance comparable to the fully supervised setting. Our method achieved 95.1%, 96.6%, 95.8%, and 92.2% in precision, recall, F1 score, and mIoU for stem–leaf segmentation on the soybean dataset, and 53%, 62.8%, and 70.3% in AP, AP@25, and AP@50 for leaf instance segmentation on the Pheno4D dataset. This study provides an effective way to characterize 3D plant architecture, which will help plant breeders enhance selection processes. The trained networks are available at https://github.com/jieyi-one/EFF-3DPSEG.
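The key idea of the fine-tuning stage described above is that the supervised loss is computed only on the tiny labeled subset (about 0.5% of points), while all other points are ignored. The sketch below illustrates this masking scheme in plain NumPy; it is an assumption-laden illustration of the general technique, not the authors' implementation (which uses a deep network; the function name, the `IGNORE` sentinel, and the toy data here are all hypothetical).

```python
import numpy as np

# Illustrative sketch of weak supervision on point clouds: unlabeled points
# carry the sentinel label IGNORE and contribute nothing to the loss.
IGNORE = -1

def sparse_cross_entropy(logits, labels):
    """Mean cross-entropy over labeled points only.

    logits: (N, C) raw per-point class scores.
    labels: (N,) integer class ids, or IGNORE for unlabeled points.
    """
    mask = labels != IGNORE
    if not np.any(mask):
        return 0.0  # nothing labeled, no supervised signal
    z = logits[mask]
    y = labels[mask]
    # numerically stable log-softmax
    z = z - z.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return float(-log_probs[np.arange(len(y)), y].mean())

# Toy example: 4 points, 2 classes (say stem=0, leaf=1), only 2 points labeled.
logits = np.array([[4.0, 0.0], [0.0, 4.0], [1.0, 1.0], [2.0, 0.0]])
labels = np.array([0, 1, IGNORE, IGNORE])
loss = sparse_cross_entropy(logits, labels)  # small: both labeled points are confident and correct
```

In a deep learning framework the same effect is usually obtained with a built-in ignore-index option of the cross-entropy loss, so the 99.5% of unlabeled points flow through the network but never receive a supervised gradient.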
Brown TB, Cheng R, Sirault XRR, Rungrat T, Murray KD, Trtilek M, Furbank RT, Badger M, Pogson BJ, Borevitz JO. TraitCapture: Genomic and environment modelling of plant phenomic data. Curr Opin Plant Biol. 2014;18:73–79.
Tardieu F, Cabrera-Bosquet L, Pridmore T, Bennett M. Plant Phenomics, from sensors to knowledge. Curr Biol. 2017;27(15):R770–R783.
Chawade A, van Ham J, Blomquist H, Bagge O, Alexandersson E, Ortiz R. High-throughput field-phenotyping tools for plant breeding and precision agriculture. Agronomy. 2019;9(5):258.
Li L, Zhang Q, Huang D. A review of imaging techniques for plant phenotyping. Sensors. 2014;14(11):20078–20111.
Bai G, Ge Y, Hussain W, Baenziger PS, Graef G. A multi-sensor system for high throughput field phenotyping in soybean and wheat breeding. Comput Electron Agric. 2016;128:181–192.
Brewer MT, Lang L, Fujimura K, Dujmovic N, Gray S, van der Knaap E. Development of a controlled vocabulary and software application to analyze fruit shape variation in tomato and other plant species. Plant Physiol. 2006;141(1):15–25.
Walter A, Silk WK, Schurr U. Environmental effects on spatial and temporal patterns of leaf and root growth. Annu Rev Plant Biol. 2009;60(1):279–304.
Sun S, Li C, Chee PW, Paterson AH, Jiang Y, Xu R, Robertson JS, Adhikari J, Shehzad T. Three-dimensional photogrammetric mapping of cotton bolls in situ based on point cloud segmentation and clustering. ISPRS J Photogramm Remote Sens. 2020;160:195–207.
Gongal A, Amatya S, Karkee M, Zhang Q, Lewis K. Sensors and systems for fruit detection and localization: A review. Comput Electron Agric. 2015;116:8–19.
Jin S, Sun X, Wu F, Su Y, Li Y, Song S, Xu K, Ma Q, Baret F, Jiang D, et al. Lidar sheds new light on plant phenomics for plant breeding and management: Recent advances and future prospects. ISPRS J Photogramm Remote Sens. 2021;171:202–223.
Qiu R, Wei S, Zhang M, Li H, Sun H, Liu G, Li M. Sensors for measuring plant phenotyping: A review. Int J Agric Biol Eng. 2018;11(2):1–17.
Yuan H, Bennett RS, Wang N, Chamberlin KD. Development of a peanut canopy measurement system using a ground-based LiDAR sensor. Front Plant Sci. 2019;10:203.
Vázquez-Arellano M, Reiser D, Paraforos DS, Garrido-Izard M, Burce MEC, Griepentrog HW. 3-D reconstruction of maize plants using a time-of-flight camera. Comput Electron Agric. 2018;145:235–247.
Hu C, Li P, Pan Z. Phenotyping of poplar seedling leaves based on a 3D visualization method. Int J Agric Biol Eng. 2018;11:145–151.
Wu S, Wen W, Wang Y, Fan J, Wang C, Gou W, Guo X. MVS-Pheno: A portable and low-cost phenotyping platform for maize shoots using multiview stereo 3D reconstruction. Plant Phenomics. 2020;2020:1848437.
Rose JC, Paulus S, Kuhlmann H. Accuracy analysis of a multi-view stereo approach for phenotyping of tomato plants at the organ level. Sensors. 2015;15(5):9651–9665.
Yang X, Strahler AH, Schaaf CB, Jupp DLB, Yao T, Zhao F, Wang Z, Culvenor DS, Newnham GJ, Lovell JL, et al. Three-dimensional forest reconstruction and structural parameter retrievals using a terrestrial full-waveform lidar instrument (echidna®). Remote Sens Environ. 2013;135:36–51.
Wu J, Cawse-Nicholson K, van Aardt J. 3D tree reconstruction from simulated small footprint waveform Lidar. Photogramm Eng Remote Sens. 2013;79(12):1147–1157.
Duan T, Chapman SC, Holland E, Rebetzke GJ, Guo Y, Zheng B. Dynamic quantification of canopy structure to characterize early plant vigour in wheat genotypes. J Exp Bot. 2016;67(15):4523–4534.
Jin S, Su Y, Wu F, Pang S, Gao S, Hu T, Liu J, Guo Q. Stem–leaf segmentation and phenotypic trait extraction of individual maize using terrestrial LiDAR data. IEEE Trans Geosci Remote Sens. 2019;57:1336–1346.
Shi W, van de Zedde R, Jiang H, Kootstra G. Plant-part segmentation using deep learning and multi-view vision. Biosyst Eng. 2019;187:81–95.
Li Y, Wen W, Miao T, Wu S, Yu Z, Wang X, Guo X, Zhao C. Automatic organ-level point cloud segmentation of maize shoots by integrating high-throughput data acquisition and deep learning. Comput Electron Agric. 2022;193:106702.
Li D, Shi G, Li J, Chen Y, Zhang S, Xiang S, Jin S. PlantNet: A dual-function point cloud segmentation network for multiple plant species. ISPRS J Photogramm Remote Sens. 2022;184:243–263.
Jin S, Su Y, Gao S, Wu F, Ma Q, Xu K, Ma Q, Hu T, Liu J, Pang S, et al. Separating the structural components of maize for field phenotyping using terrestrial LiDAR data and deep convolutional neural networks. IEEE Trans Geosci Remote Sens. 2020;58(4):2644–2658.
Xie S, Gu J, Guo D, Qi CR, Guibas L, Litany O. PointContrast: Unsupervised pre-training for 3D point cloud understanding. Cham: Springer International Publishing; 2020.
Wu Y, Xu L. Crop organ segmentation and disease identification based on weakly supervised deep neural network. Agronomy. 2019;9(11):737.
Zhou L, Xiao Q, Taha MF, Xu C, Zhang C. Phenotypic analysis of diseased plant leaves using supervised and weakly supervised deep learning. Plant Phenomics. 2023;5:0022.
Schunck D, Magistri F, Rosu RA, Cornelißen A, Chebrolu N, Paulus S, Léon J, Behnke S, Stachniss C, Kuhlmann H, et al. Pheno4D: A spatio-temporal dataset of maize and tomato plant point clouds for phenotyping and advanced plant analysis. PLOS ONE. 2021;16(8):e0256340.
Dutagaci H, Rasti P, Galopin G, Rousseau D. ROSE-X: An annotated data set for evaluation of 3D plant organ segmentation methods. Plant Methods. 2020;16(1):28.
Tian B, Luo L, Zhao H, Zhou G. VIBUS: Data-efficient 3D scene parsing with Viewpoint Bottleneck and uncertainty-spectrum modeling. ISPRS J Photogramm Remote Sens. 2022;194:302–318.
Miao T, Zhu C, Xu T, Yang T, Li N, Zhou Y, Deng H. Automatic stem-leaf segmentation of maize shoots using three-dimensional point cloud. Comput Electron Agric. 2021;187:106310.
Li D, Li J, Xiang S, Pan A. PSegNet: Simultaneous semantic and instance segmentation for point clouds of plants. Plant Phenomics. 2022;2022:9787643.
Distributed under a Creative Commons Attribution License (CC BY 4.0).