Project Details

Multi-Sensor Crop Monitoring for Cacao Production (SeMoCa)

Subject Area Electronic Semiconductors, Components and Circuits, Integrated Systems, Sensor Technology, Theoretical Electrical Engineering
Term from 2019 to 2024
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 420546347
 
Final Report Year 2023

Final Report Abstract

A growing world population and climatic changes are forcing the agricultural sector to adopt new technologies in order to maintain a reliable food supply. As a key component, the automation of agricultural processes relies heavily on the availability of sensor data. Many sensing technologies are in use for monitoring agricultural processes today, but each is limited to its respective measurement quantities, so that a combination of several sensors is necessary to gain a broad picture of the monitored environment. SeMoCa addressed this requirement with a complementary sensor fusion approach that combines a camera and a radar sensor to monitor fruits before harvest. The project focused on three topics: 1) setup of a multi-sensor system and low-level sensor data fusion, 2) world modeling including semantic information, and 3) high-level information inference and interpretation.

An ultra-wideband (UWB) radar and a passive binocular camera were investigated as the main sensors. UWB radars provide a high range resolution and can additionally look through certain materials such as leafage or fruit peel. Cameras provide a high spatial resolution, and binocular cameras can generate correctly scaled three-dimensional snapshots that can be used for mapping unknown environments. Hence, both sensing technologies can complement each other for fruit monitoring.

The fusion of both sensors starts with data association. To this end, we investigated the alignment of the coordinate systems and the sensor-specific accuracies, e.g., regarding the angular resolution. For the actual fruit monitoring task, we targeted a feature-based fusion approach with a priori feature extraction at the sensor level (see the illustrative sketch below). The camera can resolve the specific fruit shape and texture; this was exploited by a pixel-wise segmentation approach based on a Convolutional Neural Network (CNN). The radar can resolve the wideband scattering properties of the fruits, for which we developed a model-based extraction approach.

World modeling with SLAM algorithms is challenged by complex environmental conditions. We increased the robustness of state-of-the-art solutions by integrating a semantic input (e.g., generated by a CNN) into a SLAM pipeline. Finally, agricultural environments are difficult to simulate, e.g., due to natural variations of crop growth or the influence of the weather. Hence, we performed measurements with a prototype multi-sensor platform on a watermelon field. This measurement campaign provides a comprehensive database for further investigations. Additionally, we performed scattering measurements of harvested fruits in the laboratory.

In summary, we were able to show that camera and UWB radar sensors can complement each other for agricultural tasks. However, due to the short project duration and the necessary preparatory tasks, the detailed analysis of fruit state estimation needs to be continued in future research.
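The abstract mentions a feature-based camera/radar fusion that starts with data association after aligning the sensor coordinate systems. The following is a minimal sketch of one possible association step, not the project's actual code: radar detections are transformed into the camera frame with an assumed extrinsic calibration, projected with a pinhole model, and matched against a binary fruit mask such as one produced by a CNN segmentation. The rotation R, translation t, intrinsics K, the function names, and all numbers are illustrative placeholders.

```python
"""Illustrative sketch of feature-level camera/radar association
under assumed calibration parameters (not the SeMoCa implementation)."""
import numpy as np

# Assumed extrinsic calibration: rotation R and translation t mapping
# radar coordinates into the camera coordinate frame (placeholders).
R = np.eye(3)
t = np.array([0.10, 0.0, 0.0])           # e.g., 10 cm lateral sensor offset

# Assumed pinhole intrinsics of the camera (placeholders).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project_radar_points(points_radar):
    """Project Nx3 radar detections (meters) into pixel coordinates."""
    points_cam = points_radar @ R.T + t   # radar frame -> camera frame
    uvw = points_cam @ K.T                # pinhole projection
    return uvw[:, :2] / uvw[:, 2:3]       # normalize to pixel coordinates

def associate(points_radar, fruit_mask):
    """Return indices of radar detections whose projection falls on a
    fruit pixel of a binary segmentation mask of shape (H, W)."""
    uv = np.round(project_radar_points(points_radar)).astype(int)
    h, w = fruit_mask.shape
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    hits = np.zeros(len(uv), dtype=bool)
    hits[inside] = fruit_mask[uv[inside, 1], uv[inside, 0]] > 0
    return np.flatnonzero(hits)

# Toy usage: two detections, one of which lands on a fruit region.
mask = np.zeros((480, 640), dtype=np.uint8)
mask[200:280, 300:420] = 1                # stand-in for a CNN fruit mask
detections = np.array([[0.0, 0.0, 2.0],   # projects near the image center
                       [1.5, 0.0, 2.0]])  # projects outside the image
print(associate(detections, mask))        # -> [0]
```

Associated detections could then carry both the camera-derived shape/texture features and the radar-derived wideband scattering features for a fruit, which is the premise of the feature-based fusion described above.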

