Project Details

Transfer Learning for Human Activity Recognition in Logistics

Subject Area Human Factors, Ergonomics, Human-Machine Systems
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Production Systems, Operations Management, Quality Management and Factory Planning
Traffic and Transport Systems, Intelligent and Automated Traffic
Term from 2016 to 2024
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 316862460
 
In the age of Industry 4.0, manual activities remain dominant in the logistics sector. Detailed information on the occurrence and duration of relevant human activities is crucial for warehousing efficiency and thus for the entire supply chain. As manual assessment is economically inexpedient, methods of human activity recognition (HAR) gain relevance. HAR is a classification task that recognizes human movements from time-series data and is already used in applications such as smart homes, rehabilitation and health support. HAR based on non-invasive and highly reliable on-body devices is of special relevance, as these devices extend its potential to challenging scenarios. Training a classifier demands a large amount of data, as human movements are highly variable and diverse, in particular in the diverse environments of the logistics sector.

The objective of this project is to develop a method for avoiding the tremendous effort of creating and annotating high-quality on-body-device data for HAR in logistics. Different logistics scenarios will be replicated in a reference field equipped with a highly accurate optical motion-capture system (oMoCap). In this constrained environment, oMoCap and on-body device data will be captured synchronously. The combined oMoCap recordings of all scenarios constitute a reference dataset; methods of transfer and zero-shot learning will make it possible to constitute such a reference dataset across the scenarios. Methods of machine learning, especially deep learning, will be considered for processing the time-series and for training a classifier on the reference oMoCap dataset. The classifier will allow for an automated annotation of the synchronized on-body device data. Furthermore, methods for creating additional synthetic data from raw oMoCap data will be considered.
The performance of a classifier trained on the automatically annotated and synthetic on-body device data will then be evaluated against manually annotated data from a real warehouse.
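The annotation-transfer idea described above can be illustrated with a minimal sketch: a classifier is fitted on labeled reference windows (standing in for the oMoCap reference dataset) and then used to auto-annotate synchronously recorded on-body windows. All data, the nearest-centroid model, and every function name here are hypothetical illustrations, not the project's actual deep-learning pipeline.

```python
import random
import statistics

# Hypothetical sketch: all data is synthetic, and a nearest-centroid
# classifier stands in for the project's deep time-series model.
random.seed(0)

def make_window(label):
    # one synthetic 100-sample sensor window centred on the class value
    return [label + random.gauss(0, 0.3) for _ in range(100)]

def feature(window):
    # simple mean/std summary features of a window
    return (statistics.mean(window), statistics.stdev(window))

# "reference" dataset: two activity classes (e.g. walking vs. handling)
train = [(feature(make_window(c)), c) for c in (0, 1) for _ in range(30)]

def centroid(c):
    feats = [f for f, lab in train if lab == c]
    return tuple(statistics.mean(x) for x in zip(*feats))

centroids = {c: centroid(c) for c in (0, 1)}

def predict(f):
    # assign the class whose centroid is closest in feature space
    return min(centroids,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(f, centroids[c])))

# auto-annotate synchronized "on-body" windows with the reference-trained model
onbody = [make_window(c) for c in (0, 1) for _ in range(10)]
auto_labels = [predict(feature(w)) for w in onbody]
print(auto_labels)
```

In the real project the labels would come from the oMoCap reference recordings, and the auto-annotated on-body windows would then serve as training data for the final HAR classifier.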
DFG Programme Research Grants
 
 
