
BigPlantSens - Investigating the Synergies of Big Data and Deep Learning for the Remote Sensing of Plant Species

Applicant Dr. Teja Kattenborn
Subject area Ecology and Biodiversity of Plants and Ecosystems
Forestry
Geodesy, Photogrammetry, Remote Sensing, Geoinformatics, Cartography
Physical Geography
Funding Funded from 2020 to 2024
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 444524904

Year of creation 2025

Summary of project results

Accurate information on the geographic distribution of plant species is crucial for applications in research, nature conservation, forestry, agriculture, and ecosystem service assessments. Recent advances in Uncrewed Aerial Vehicles (UAVs) have greatly enhanced our ability to map plant species distributions in high spatial and temporal detail. Combined with deep learning-based pattern recognition, many plant species can be identified accurately from UAV aerial imagery. A key challenge, however, is the need for large sets of reference observations, that is, plant images with species labels, to train deep learning models effectively. A possible solution is plant photographs from species identification apps: initiatives such as iNaturalist or Pl@ntNet provide millions of citizen science photographs covering thousands of plant species. These crowd-sourced datasets offer an unprecedented portfolio of the appearance of plant species. In the BigPlantSens project, we therefore used this citizen science data to train pattern recognition models that identify plant species in UAV imagery. A key challenge of citizen science data is that each photograph only indicates that a species is present, not where in the image it appears. We therefore developed several approaches that exploit this so-called 'weak' information in citizen science plant photographs for semantic segmentation of plant species in UAV imagery (pixel-wise classification). We showed that, in this way, a range of plant species can be identified accurately by models trained on citizen science data alone. A critical factor in this regard is the spatial resolution of the UAV imagery, as it limits how many characteristic features of a plant species remain visible.
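The report does not specify which weak-supervision approaches were developed. One widely used technique for deriving pixel-wise maps from image-level ("species present") labels is class activation mapping (CAM), where a classifier trained on whole images is reused to localise the class within an image. A minimal NumPy sketch of that idea, with illustrative toy data (not the project's actual method or model):

```python
import numpy as np

def class_activation_map(features, class_weights):
    """Weighted sum of conv feature maps -> coarse localisation map.
    features: (C, H, W) activations from a network's last conv layer
    class_weights: (C,) classifier weights for one species (after
    global average pooling), trained only on image-level labels."""
    cam = np.tensordot(class_weights, features, axes=1)  # (H, W)
    cam = np.maximum(cam, 0.0)        # keep positive evidence only
    if cam.max() > 0:
        cam /= cam.max()              # normalise to [0, 1]
    return cam

def cam_to_mask(cam, threshold=0.5):
    """Threshold the activation map into a pixel-wise presence mask."""
    return cam >= threshold

# toy example: 3 feature channels on a 4x4 grid
rng = np.random.default_rng(0)
features = rng.random((3, 4, 4))
weights = np.array([1.0, 0.5, -0.2])
mask = cam_to_mask(class_activation_map(features, weights))
```

In practice, such coarse maps are refined (e.g. with smoothing or region growing) before serving as pseudo-labels for a segmentation model; the refinement step is omitted here.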
We found that a strength of citizen science data is its high variability, which facilitates training models that transfer across illumination conditions and scene components. We also found that certain characteristics of the training images, such as acquisition distance or viewing angle, can degrade mapping accuracy. Consequently, we developed approaches to filter image datasets by these characteristics before training. In BigPlantSens, we successfully explored the synergies between big data on plant photographs, drone imagery, and pattern recognition for fully automated vegetation mapping. Such fully automated vegetation mapping could pave the way for refined approaches to biodiversity monitoring and ecosystem management.
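The metadata-based filtering described above can be sketched as follows. The field names and threshold values are illustrative assumptions, not the project's actual criteria; the underlying idea is simply to discard photographs whose acquisition geometry differs too much from the nadir UAV perspective:

```python
from dataclasses import dataclass

@dataclass
class PhotoMeta:
    """Per-photograph metadata (hypothetical fields for illustration)."""
    path: str
    distance_m: float      # estimated camera-to-plant distance
    view_angle_deg: float  # deviation from a nadir (top-down) view

def filter_training_photos(photos, max_distance_m=5.0, max_angle_deg=60.0):
    """Keep only photographs taken close up and roughly top-down,
    so their appearance resembles what the UAV sees from above."""
    return [
        p for p in photos
        if p.distance_m <= max_distance_m and p.view_angle_deg <= max_angle_deg
    ]

photos = [
    PhotoMeta("close_nadir.jpg", 2.0, 10.0),    # kept
    PhotoMeta("far_away.jpg", 30.0, 10.0),      # dropped: too distant
    PhotoMeta("side_view.jpg", 2.0, 85.0),      # dropped: too oblique
]
kept = filter_training_photos(photos)
```

In practice, such attributes would themselves have to be estimated from the photographs (e.g. by auxiliary models), since citizen science uploads rarely carry this metadata.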

Project-related publications (selection)

