Research Article | Open Access

Predicting and Visualizing Citrus Color Transformation Using a Deep Mask-Guided Generative Network

Zehan Bao1, Weifu Li1,2, Jun Chen1,2, Hong Chen1,2, Vijay John3, Chi Xiao4 (✉), Yaohui Chen5 (✉)

1 College of Informatics, Huazhong Agricultural University, Wuhan 430070, China
2 Engineering Research Center of Intelligent Technology for Agriculture, Ministry of Education, Wuhan, China
3 RIKEN, Guardian Robot Project, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
4 Key Laboratory of Biomedical Engineering of Hainan Province, School of Biomedical Engineering, Hainan University, Haikou 570100, China
5 College of Engineering, Huazhong Agricultural University, Wuhan 430070, China

†These authors contributed equally to this work.


Abstract

Citrus rind color is a reliable indicator of fruit development, so methods that monitor and predict color transformation support decisions on crop management practices and harvest scheduling. This work presents a complete workflow for predicting and visualizing citrus color transformation in the orchard with high accuracy and fidelity. A total of 107 Navel orange samples were observed during the color transformation period, yielding a dataset of 7,535 citrus images. A framework is proposed that integrates visual saliency into deep learning; it consists of a segmentation network, a deep mask-guided generative network, and a loss network with manually designed loss functions. Moreover, fusing image features with temporal information enables a single model to predict rind color at different time intervals, effectively reducing the number of model parameters. The semantic segmentation network achieves a mean intersection-over-union (mIoU) score of 0.9694, and the generative network obtains a peak signal-to-noise ratio (PSNR) of 30.01 and a mean local style loss of 2.710, indicating that the generated images are of high quality, closely resemble the targets, and are consistent with human perception. To ease real-world application, the model is ported to an Android-based application for mobile devices. The methods can be readily extended to other fruit crops with a color transformation period. The dataset and source code are publicly available on GitHub.
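The reported evaluation metrics (mIoU for the segmentation network, PSNR for the generative network) follow their standard definitions. The minimal NumPy sketch below illustrates how both are typically computed; the function names are illustrative and not taken from the paper's released code.

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean intersection over union across classes, skipping classes
    absent from both the predicted and the reference mask."""
    ious = []
    for c in range(num_classes):
        p = pred == c
        t = target == c
        union = np.logical_or(p, t).sum()
        if union == 0:
            continue  # class does not appear in either mask
        ious.append(np.logical_and(p, t).sum() / union)
    return float(np.mean(ious))

def psnr(img_a, img_b, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two same-shape images."""
    mse = np.mean((img_a.astype(np.float64) - img_b.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_val**2 / mse)
```

Higher is better for both: an mIoU of 0.9694 means predicted and ground-truth fruit masks overlap almost completely, and a PSNR of 30 dB corresponds to a per-pixel mean squared error of about 65 on an 8-bit scale.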

Plant Phenomics
Article number: 0057
Cite this article:
Bao Z, Li W, Chen J, et al. Predicting and Visualizing Citrus Color Transformation Using a Deep Mask-Guided Generative Network. Plant Phenomics, 2023, 5: 0057. https://doi.org/10.34133/plantphenomics.0057


Received: 21 November 2022
Accepted: 19 May 2023
Published: 07 June 2023
© 2023 Zehan Bao et al. Exclusive licensee Nanjing Agricultural University. No claim to original U.S. Government Works.

Distributed under a Creative Commons Attribution License 4.0 (CC BY 4.0).
