A digital framework for automated non-invasive waterfowl detection in Carinthia based on high resolution UAS imagery and machine learning

Authors

  • Gernot Paulus, Carinthia University of Applied Sciences
  • Ulf Scherling, Carinthia University of Applied Sciences
  • Mohammed Sa’Doun, Carinthia University of Applied Sciences
  • Karl-Heinrich Anders, Carinthia University of Applied Sciences
  • Werner Petutschnig
  • Johann Wagner
  • Christopher Lippitt, University of New Mexico

DOI:

https://doi.org/10.71911/cii-p3-nt-2024112

Keywords:

waterfowl surveying, uncrewed aerial systems, deep learning, YOLO, expert annotation, digital geotransformation

Abstract

Automated waterfowl detection from uncrewed aerial system (UAS; “drone”) imagery has become an important task for environmental applications such as wildlife monitoring, nature conservation, and habitat mapping. This paper presents a digital framework for automated waterfowl detection using high-resolution UAS imagery and artificial intelligence/machine learning (ML). Several UAS missions in Brenndorf, Carinthia, Austria, were flown simultaneously with a traditional ground-based waterfowl field survey by an experienced expert. Several data pre-processing steps were applied to optimize the digital image data pipeline for the generation of high-quality ML training data. The open-source You Only Look Once (YOLO) computer vision object detection model was used to detect waterfowl in the UAS imagery. A transfer learning approach building on a large waterfowl study at the University of New Mexico, conducted in collaboration with the U.S. Fish and Wildlife Service, was used to further improve the model’s performance. Validation showed promising performance, with 80% and 83% classification accuracy for the waterfowl classes ‘duck’ and ‘swan’, respectively. Finally, a spatial projection model and a map-based visualization of the ML detection and classification results were implemented. The proposed digital framework for automated waterfowl detection provides promising results for standardization and a new paradigm for waterfowl counting that supports and extends traditional wildlife monitoring.
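The spatial projection step mentioned in the abstract can be sketched as follows. This is a minimal, hypothetical illustration (the detections, class names, and geotransform coefficients below are invented for the example, not taken from the paper): it maps YOLO bounding-box centres from image pixel coordinates into map coordinates using a standard six-parameter affine geotransform, as stored in a world file or GeoTIFF header.

```python
def pixel_to_map(col, row, transform):
    """Apply a 6-parameter affine geotransform (a, b, c, d, e, f):
    x = a*col + b*row + c ;  y = d*col + e*row + f
    (the convention used by world files / GDAL-style geotransforms)."""
    a, b, c, d, e, f = transform
    return (a * col + b * row + c, d * col + e * row + f)

# Hypothetical YOLO detections: (class_name, confidence, bbox in pixels)
detections = [
    ("duck", 0.91, (120, 340, 160, 372)),
    ("swan", 0.87, (800, 512, 862, 570)),
]

# Hypothetical geotransform for a 2 cm ground-sample-distance orthophoto
# in a projected CRS (e.g. UTM): pixel size 0.02 m, north-up (negative e).
transform = (0.02, 0.0, 450000.0, 0.0, -0.02, 5160000.0)

features = []
for cls, conf, (x1, y1, x2, y2) in detections:
    cx, cy = (x1 + x2) / 2, (y1 + y2) / 2   # bbox centre in pixel space
    features.append({"class": cls, "confidence": conf,
                     "point": pixel_to_map(cx, cy, transform)})
```

The resulting point features could then be loaded into a GeoDataFrame (the paper cites GeoPandas) for map visualization and overlay with the ground-survey counts.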

Author Biographies

  • Gernot Paulus, Carinthia University of Applied Sciences

    Spatial Informatics for ENvironmental Applications (SIENA)

  • Ulf Scherling, Carinthia University of Applied Sciences

    Spatial Informatics for ENvironmental Applications (SIENA) 

  • Mohammed Sa’Doun, Carinthia University of Applied Sciences

    Spatial Informatics for ENvironmental Applications (SIENA) 

  • Karl-Heinrich Anders, Carinthia University of Applied Sciences

    Spatial Informatics for ENvironmental Applications (SIENA) 

  • Werner Petutschnig

    Office of the Carinthian Provincial Government, Dept. 8 - Environment, Energy and Nature Conservation, Klagenfurt 

  • Johann Wagner

    Office of the Carinthian Provincial Government, Dept. 8 - Environment, Energy and Nature Conservation, Klagenfurt

  • Christopher Lippitt, University of New Mexico

    CMS-RS, University of New Mexico Department of Geography and Environmental Studies, Albuquerque NM, 87131 

References

S. J. Dundas, M. Vardanega, P. O’Brien, and S. R. McLeod, “Quantifying waterfowl numbers: Comparison of drone and ground-based survey methods for surveying waterfowl on artificial waterbodies,” Drones, vol. 5, no. 1, p. 5, Jan. 2021. doi: 10.3390/drones5010005

S. Wagner and W. Petutschnig, “Wasservogelzählung in Kärnten 2021,” Carinthia II, vol. 211_131_1, pp. 177–186, 2021.

IWC, “About the International Waterbird Census,” Wetlands.org. Accessed: Aug. 26, 2024. [Online]. Available: https://iwc.wetlands.org/index.php/aboutiwc

D. Wen, L. Su, Y. Hu, Z. Xiong, M. Liu, and Y. Long, “Surveys of large waterfowl and their habitats using an unmanned aerial vehicle: A case study on the Siberian crane,” Drones, vol. 5, no. 4, p. 102, Sep. 2021. doi: 10.3390/drones5040102

S.-J. Hong, Y. Han, S.-Y. Kim, A.-Y. Lee, and G. Kim, “Application of deep-learning methods to bird detection using unmanned aerial vehicle imagery,” Sensors, vol. 19, no. 7, p. 1651, Apr. 2019. doi: 10.3390/s19071651

Z. Tang et al., “sUAS and machine learning integration in waterfowl population surveys,” in Proc. 33rd IEEE Int. Conf. Tools with Artif. Intell. (ICTAI), Nov. 2021. doi: 10.1109/ICTAI52525.2021.00084

J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, “You only look once: Unified, real-time object detection,” in Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), 2016, pp. 779–788. doi: 10.1109/CVPR.2016.91

R. L. Converse, C. D. Lippitt, S. E. Sesnie, G. M. Butler, and D. R. Stewart, “Observer variability in manual-visual interpretation of UAS imagery of wildlife, with insights for deep learning applications,” [Unpublished].

R. L. Converse, “Drones for ducks,” Zooniverse. Accessed: Aug. 26, 2024. [Online]. Available: https://www.zooniverse.org/projects/rowan-aspire/drones-for-ducks

European Union, Commission Delegated Regulation (EU) 2019/945 of 12 March 2019 on unmanned aircraft systems and on third-country operators of unmanned aircraft systems. Accessed: Aug. 26, 2024. [Online]. Available: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A02019R0945-20200809

European Union, Commission Implementing Regulation (EU) 2019/947 of 24 May 2019 on the rules and procedures for the operation of unmanned aircraft. Accessed: Aug. 26, 2024. [Online]. Available: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A02019R0947-20220404

Rechtsinformationssystem des Bundes (RIS), “Kundmachung der Landesregierung vom 26. März 2019, Zl. 01-VD-LG-1883/1-2019, über die Wiederverlautbarung des Kärntner Nationalpark- und Biosphärenparkgesetzes,” RIS. Accessed: Aug. 26, 2024. [Online]. Available: https://www.ris.bka.gv.at/GeltendeFassung.wxe?Abfrage=LrK&Gesetzesnummer=20000339&FassungVom=2024-08-18

M. M. Sa’doun, C. D. Lippitt, G. Paulus, and K. Anders, “A comparison of convolutional neural network architectures for automated detection and identification of waterfowl in complex environments,” GI_Forum, vol. 1, pp. 152–166, 2021. doi: 10.1553/giscience2021_02_s152

U. Nepal and H. Eslamiat, “Comparing YOLOv3, YOLOv4 and YOLOv5 for autonomous landing spot detection in faulty UAVs,” Sensors, vol. 22, no. 2, p. 464, Jan. 2022. doi: 10.3390/s22020464

R. L. Converse, C. D. Lippitt, S. E. Sesnie, G. M. Butler, and D. R. Stewart, “Optimizing deep learning training data for automating aerial wildlife monitoring,” [Unpublished].

K. Weiss, T. M. Khoshgoftaar, and D. Wang, “A survey of transfer learning,” J. Big Data, vol. 3, no. 1, p. 9, May 2016. doi: 10.1186/s40537-016-0043-6

G. Wang, Y. Chen, P. An, H. Hong, J. Hu, and T. Huang, “UAV-YOLOv8: A small-object-detection model based on improved YOLOv8 for UAV aerial photography scenarios,” Sensors, vol. 23, no. 16, Art. no. 7190, Aug. 2023. doi: 10.3390/s23167190

J. Van den Bossche, “geopandas/geopandas: v1.0.1,” Zenodo, Jul. 2, 2024. doi: 10.5281/zenodo.12625316

C. D. Lippitt, D. A. Stow, and K. C. Clarke, “On the nature of models for time-sensitive remote sensing,” Int. J. Remote Sens., vol. 35, no. 18, pp. 6815–6841, Sep. 2014. doi: 10.1080/01431161.2014.965287

Published

01-01-2024

How to Cite

A digital framework for automated non-invasive waterfowl detection in Carinthia based on high resolution UAS imagery and machine learning. (2024). Carinthia II Part 3 - Carinthia Nature Tech, 1(1), 19. https://doi.org/10.71911/cii-p3-nt-2024112