Deep-learning based population monitoring of the endangered plant species Gladiolus illyricus: lessons learned for implementation of a technology-based biodiversity monitoring approach
DOI: https://doi.org/10.71911/cii-p3-nt-2025211

Keywords: deep learning, biodiversity monitoring, Gladiolus illyricus, flower detection, UAS

Abstract
New technologies offer promising possibilities for biodiversity monitoring by increasing the standardization of sampling methods and improving cost efficiency. Among these technologies, uncrewed aerial systems (UAS) are now widely used to produce orthomosaics of a given area, and computer-intensive methods for automated object detection in images have advanced in parallel. While such methods are widely used in science, applied nature conservation makes little use of them. The present study tested the applicability of UAS combined with a deep-learning based object detection workflow in Schütt-Graschelitzen, a small Natura 2000 protected area near Villach, Austria. For this purpose, we trained a YOLOv8 algorithm on flowers of Gladiolus illyricus extracted from an orthomosaic. The orthomosaic was split into about 1,000 equally sized tiles, of which 80 tiles were used for training and 20 tiles for validation. For ground truthing, individual inflorescences were counted manually. Our main findings indicate moderate model performance on the training and validation datasets as well as on new data. The moderate rather than strong performance is most likely a result of too little training data: while object detection itself worked reasonably well, the background was too variable, making reliable classification challenging. A comparison of the individual work steps (excluding the UAS mission) suggests that creating a representative training dataset is the most time-intensive part of the workflow. For small areas and a single survey, this is unlikely to be efficient compared with traditional field sampling methods. Its efficiency increases with each resurvey event, however, as pretrained deep-learning models developed during prior monitoring cycles can be reused, reducing the amount of training data required in subsequent surveys. In addition, UAS- and deep-learning based monitoring can help at sites that are highly sensitive to trampling and favors large study areas, as its efficiency increases with the size of the surveyed area.
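The abstract outlines the core workflow: cut the orthomosaic into equally sized tiles, annotate the Gladiolus inflorescences, train a YOLOv8 detector, and count detections on unseen tiles. The Python sketch below illustrates how such a pipeline could look; the file names (orthomosaic.tif, gladiolus.yaml), the 640 px tile size, and the use of rasterio together with the Ultralytics API are assumptions for this example, not the authors' published scripts.

```python
# Minimal sketch of the tiling + detection workflow described in the abstract
# (assumed file names, tile size, and libraries).
import os

import rasterio
from rasterio.windows import Window
from ultralytics import YOLO

TILE = 640  # tile edge length in pixels (assumption for this example)


def split_orthomosaic(src_path: str, out_dir: str) -> None:
    """Cut the UAS orthomosaic into equally sized, georeferenced tiles."""
    os.makedirs(out_dir, exist_ok=True)
    with rasterio.open(src_path) as src:
        for row in range(0, src.height, TILE):
            for col in range(0, src.width, TILE):
                width = min(TILE, src.width - col)
                height = min(TILE, src.height - row)
                window = Window(col, row, width, height)
                profile = src.profile.copy()
                profile.update(width=width, height=height,
                               transform=src.window_transform(window))
                out_path = os.path.join(out_dir, f"tile_{row}_{col}.tif")
                with rasterio.open(out_path, "w", **profile) as dst:
                    dst.write(src.read(window=window))


if __name__ == "__main__":
    split_orthomosaic("orthomosaic.tif", "tiles")

    # Train a pretrained YOLOv8 checkpoint on manually annotated tiles.
    # "gladiolus.yaml" would list the train/val folders and the single
    # class "inflorescence" (80/20 tile split, as described in the abstract).
    model = YOLO("yolov8n.pt")
    model.train(data="gladiolus.yaml", epochs=100, imgsz=TILE)
    model.val()  # precision, recall and mAP on the validation tiles

    # Run the trained detector over all tiles and sum the detections
    # as a simple proxy for the number of flowering individuals.
    results = model.predict("tiles", conf=0.25)
    print("Detected inflorescences:", sum(len(r.boxes) for r in results))
```

Summing the per-tile detections gives an automated estimate of the inflorescence count, which can then be compared against the manual ground-truth counts mentioned above.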
License
Copyright (c) 2025 Carinthia II / Part 3 - Carinthia Nature Tech

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.