Project Details

Multimodal AI-based pain measurement in intermediate care patients in the postoperative period

Subject Area Anaesthesiology
Term since 2023
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 527765259
 
The aim of the project is the automated, continuous measurement of pain intensity in postoperative inpatients using multimodal sensors and artificial intelligence methods, which will be further developed and validated for clinical use. The result is a novel system for real-time pain monitoring that relieves medical staff of routine tasks and can alert experts when necessary. The project is divided into the following phases: "Testing", Phase I+II: "Translation & Data Collection", and Phase III: "Validation".

Testing: The multimodal measurement infrastructure will be developed in a laboratory environment. The necessary infrared camera technology (for capturing facial expressions and gestures), biosignal amplifiers, and data synchronization and recording equipment will be installed, and a laboratory room will be set up to match a room in the intensive care unit of the University Hospital Ulm, i.e. with an identical intensive care bed, the same placement of the bed relative to the wall, and ceiling lights and/or medical technology in the form of "dummies", etc.

Phase I: The measurement infrastructure will be integrated into a patient room of the Interdisciplinary Operative Intensive Care Unit of the University Hospital Ulm. Multimodal data will then be collected from 84 fully oriented patients. The "gold standard" (AI ground truth) here is the patients' subjective pain ratings together with observer assessment: a baseline measurement is taken before the planned operation, followed by continuous measurement over 48 hours after the operation, with pain recorded over time including the dosage of analgesic medication. In parallel, the algorithms for multimodal pain detection developed in the laboratory are being adapted to the new requirements.
This includes applying deep transfer learning to adapt computer vision methods from color to NIR video, adapting methods for real-time data analysis, and using transfer learning to adapt the pain detection models from our preliminary work to clinical conditions, requirements, and pain modalities. Deep learning makes it possible to learn feature extraction optimized for the application and has great potential in combination with transfer learning.

Phase II: Multimodal data will likewise be collected and analyzed in a further sample of patients who are not fully oriented (N=42); here the "gold standard" is observer assessment only.

Phase III: A demonstrator will be developed and validated with a further group of patients (N=10) comprising both fully oriented and not fully oriented patients. An algorithm will detect pain in real time. An acceptance survey of the demonstrator among medical staff will also be conducted. Results will be published in journals and presented at conferences.
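One common way to transfer a model pretrained on color video to single-channel NIR input is to collapse the first convolutional layer's RGB filters into one channel by averaging, then fine-tune on the new data. The sketch below illustrates only that initialisation step with NumPy; the function name and shapes are hypothetical and not taken from the project.

```python
import numpy as np

def rgb_to_nir_first_layer(rgb_weights: np.ndarray) -> np.ndarray:
    """Collapse pretrained RGB conv filters (out_ch, 3, kH, kW) into
    single-channel NIR filters (out_ch, 1, kH, kW) by averaging over
    the color channels -- a common initialisation when transferring a
    color-video model to near-infrared input (illustrative helper)."""
    if rgb_weights.shape[1] != 3:
        raise ValueError("expected 3 input channels (RGB)")
    return rgb_weights.mean(axis=1, keepdims=True)

# Example: a hypothetical pretrained first layer with 64 filters of size 7x7
pretrained = np.random.randn(64, 3, 7, 7)
nir_layer = rgb_to_nir_first_layer(pretrained)
print(nir_layer.shape)  # (64, 1, 7, 7)
```

After this re-initialisation, the remaining layers keep their pretrained weights and the whole network is fine-tuned on NIR recordings, which is the usual deep-transfer-learning recipe for such a domain shift.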
DFG Programme Research Grants
 
 
