Project Details
learnINg versaTile lEgged locomotioN wiTh actIve perceptiON
Applicant
Professor Jan Reinhard Peters, Ph.D.
Subject Area
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Automation, Mechatronics, Control Systems, Intelligent Technical Systems, Robotics
Term
since 2022
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 506123304
Mobile robotics is expected to develop rapidly in the coming years. The design of efficient mobile robot platforms will open up applications in many scenarios and could substitute for humans in laborious, repetitive, or dangerous tasks. Among mobile robot platforms, legged robots are of particular interest. Unlike wheeled platforms, which require flat ground, their structure allows them to traverse challenging areas, e.g., irregular ground, debris, or tight spaces; and unlike other bioinspired approaches, they can carry heavy payloads. The objective of this project is to develop the fundamental methodology for designing highly autonomous legged platforms. The desired solution should adapt to demanding environments with low clearance and rough terrain, enabling the robot to actively avoid faulty states and to recover autonomously. To significantly improve the autonomy of gait controllers, we need to take a deeper look at two fundamental aspects of the interaction with the environment: perception and action. We will exploit modern machine learning techniques to build the theoretical and practical foundations of a more general framework. The key novelty of our approach lies in the joint analysis of learning for perception and action. Indeed, there are many important reasons to consider action and perception jointly: researchers have demonstrated that active and interactive perception is fundamental to perceiving and estimating quantities that cannot be measured by passively sensing the environment, particularly in manipulation tasks. Furthermore, to build strong algorithmic foundations in perception and locomotion learning, it is crucial to consider what type of input and output data each module produces, how the data is processed, and which kind of representation is most suitable for the learning process.
This integrated solution will enable our system to be aware of the surrounding environment, to react to unexpected terrain properties while avoiding slippage and failures, to sense contact with surrounding obstacles, and to exploit this physical interaction to locomote efficiently through narrow passages, going beyond current state-of-the-art methods. The final objective of our research is to validate the methodology against high-quality scientific evaluation standards and benchmarks. To better highlight the strengths and possible drawbacks of the developed solution, we will test the proposed system in a cave exploration scenario. Our goal is to demonstrate the solution's effectiveness on a task that goes beyond existing locomotion test scenarios and to provide an easy way to compare our platform with commercially available hardware and software solutions.
DFG Programme
Research Grants
International Connection
Poland
Cooperation Partner
Professor Dr. Krzysztof Tadeusz Walas