Project Details

IP5: Uncertainty meets explainability -- Combining Uncertainty Quantification and Explainable Machine Learning for Crop Monitoring

Subject Area: Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term: since 2022
Project identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 459376902
 
One of the preeminent goals of applying machine learning is prediction, i.e., mapping input data to a known output. A predictive model is usually trained for high accuracy, but decision-making processes also demand an understanding of how a specific model operates and how uncertain its decision is, for example to verify the reliability of the result. Recently, explainable ML has emerged as a way to gain insight into the workings of deep neural networks. These techniques combine domain knowledge with interpretation tools that map complex processes (e.g. the decision process of a neural network) into a human-understandable space. Equally important is the quantification of the uncertainty of the outcome. Although neural networks can provide a confidence score for a prediction, they tend to be overconfident; if these confidences are used as a basis for decision making, this can lead to incorrect or unreliable outcomes.

So far, the fields of explainable machine learning and uncertainty quantification have rarely been considered together for decision making. The two directions are complementary: explainable machine learning - especially sensitivity analysis - considers the reasons for a decision and the importance of uncertainties, while uncertainty quantification aims at assessing the reliability of the decision. We argue that neither interpretation tools from explainable ML nor the confidence of neural networks alone are sufficient to make a comprehensive statement about the reliability of an outcome. Therefore, this project will explore the novel combination of uncertainty quantification and explainable machine learning for decision making in horticulture, and will provide input variables that support efficient sensing and the definition of tactical management decisions.
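To illustrate the two complementary quantities discussed above, the following is a minimal NumPy sketch, not the project's actual method: a toy linear classifier whose single softmax confidence is contrasted with an ensemble-based predictive entropy (a stand-in for uncertainty quantification methods such as deep ensembles or MC dropout), plus a finite-difference input sensitivity as a simple explainability signal. The model, its weights, and the perturbation scale are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # numerically stable softmax over the last axis
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Toy linear classifier with 4 input features and 3 classes
# (weights are random placeholders, not a trained crop-monitoring model)
W = rng.normal(size=(4, 3))
x = rng.normal(size=4)

# Single-model confidence: the max softmax probability,
# which is known to be overconfident for deep networks
p_single = softmax(x @ W)
confidence = p_single.max()

# Uncertainty quantification via a small ensemble of weight perturbations;
# the predictive entropy of the averaged distribution measures uncertainty
probs = np.stack(
    [softmax(x @ (W + 0.3 * rng.normal(size=W.shape))) for _ in range(100)]
)
p_mean = probs.mean(axis=0)
predictive_entropy = -(p_mean * np.log(p_mean + 1e-12)).sum()

# Explainability via sensitivity analysis: central finite differences of the
# predicted class probability with respect to each input feature
cls = int(p_mean.argmax())
eps = 1e-4
sensitivity = np.array([
    (softmax((x + eps * np.eye(4)[i]) @ W)[cls]
     - softmax((x - eps * np.eye(4)[i]) @ W)[cls]) / (2 * eps)
    for i in range(4)
])
```

A decision rule could then consult all three signals: accept a prediction only when the confidence is high, the predictive entropy is low, and the sensitivity profile matches domain expectations.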
DFG Programme: Research Units
 
 
