Research Article | Open Access

Channel Attention GAN-Based Synthetic Weed Generation for Precise Weed Identification

Tang Li¹, Motoaki Asai², Yoichiro Kato¹, Yuya Fukano³, Wei Guo¹ (corresponding author)
¹ Graduate School of Agricultural and Life Sciences, The University of Tokyo, Tokyo 188-0002, Japan
² Institute for Plant Protection, National Agriculture and Food Research Organization, Fukushima 960-2156, Japan
³ Graduate School of Horticulture, Chiba University, Chiba 271-0092, Japan

Abstract

Weeds are a major biological cause of declines in crop yield. However, widespread herbicide application and indiscriminate weeding with soil disturbance raise serious environmental concerns. Site-specific weed management (SSWM) is a weed management strategy for digital agriculture that treats weeds only where they occur, reducing energy loss. Deep learning is crucial for developing SSWM because it can distinguish crops from weeds and identify weed species; however, it requires substantial annotated data, and annotation demands expertise in weed science and agronomy. In this study, we present a channel attention mechanism-driven generative adversarial network (CA-GAN) that generates realistic synthetic weed data. The performance of the model was evaluated on two datasets: the public segmented Plant Seedling Dataset (sPSD), featuring nine common broadleaf weeds from arable land, and the Institute for Sustainable Agro-ecosystem Services (ISAS) dataset, which includes five common summer weeds in Japan. The synthetic dataset generated by the proposed CA-GAN achieved a recognition accuracy of 82.63% on the sPSD and 93.46% on the ISAS dataset. The Fréchet inception distance (FID) measures the similarity between a synthetic and a real dataset and has been shown to correlate well with human judgments of the quality of synthetic samples; the synthetic dataset achieved low FID scores (20.95 on the sPSD and 24.31 on the ISAS dataset). Overall, the experimental results demonstrated that the proposed method outperformed previous state-of-the-art GAN models in terms of image quality, diversity, and discriminability, making it a promising approach for synthetic agricultural data generation.
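The FID reported above compares the mean and covariance of feature vectors extracted from real and synthetic images by a pretrained Inception network. The Fréchet distance statistic itself can be sketched as below; this is an illustrative implementation, not the authors' evaluation code, and it takes arbitrary feature arrays in place of Inception activations.

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(feats_real, feats_fake):
    """Frechet distance between two sets of feature vectors.

    feats_real, feats_fake: arrays of shape (n_samples, n_features).
    In the full FID score these would be Inception activations for
    real and generated images, respectively.
    """
    mu_r, mu_f = feats_real.mean(axis=0), feats_fake.mean(axis=0)
    cov_r = np.cov(feats_real, rowvar=False)
    cov_f = np.cov(feats_fake, rowvar=False)
    # Matrix square root of the covariance product; tiny imaginary
    # components from numerical error are discarded.
    covmean = sqrtm(cov_r @ cov_f)
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    diff = mu_r - mu_f
    return float(diff @ diff + np.trace(cov_r + cov_f - 2.0 * covmean))
```

A lower value indicates that the two feature distributions are closer; identical distributions give a distance near zero, which is why the low scores of 20.95 and 24.31 indicate that the synthetic weeds resemble the real ones.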

Plant Phenomics
Article number: 0122
Cite this article:
Li T, Asai M, Kato Y, et al. Channel Attention GAN-Based Synthetic Weed Generation for Precise Weed Identification. Plant Phenomics, 2024, 6: 0122. https://doi.org/10.34133/plantphenomics.0122


Received: 09 November 2023
Accepted: 18 February 2024
Published: 28 March 2024
© 2024 Tang Li et al. Exclusive licensee Nanjing Agricultural University. No claim to original U.S. Government Works.

Distributed under a Creative Commons Attribution License 4.0 (CC BY 4.0).
