Project Details

Gaze and Motor Control – Patterns of Coordination

Applicant Dr. Jolande Fooken
Subject Area General, Cognitive and Mathematical Psychology
Human Cognitive and Systems Neuroscience
Cognitive, Systems and Behavioural Neurobiology
Term from 2020 to 2024
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 441960462
 
Final Report Year 2024

Final Report Abstract

Real-world actions, such as cooking or eating dinner, involve the continuous coordination of perceptual and sensorimotor processes. When looking and acting in the natural environment, humans move their eyes to gather sensory information about task-relevant objects and to support ongoing motor control. The project Gaze and Motor Control – Patterns of Coordination presents a series of experiments that examined eye-hand coordination in situations in which high-acuity (i.e., central) vision was needed at competing locations.

In the first study, participants used either their fingertips or a tool (i.e., tweezers) to grasp a ball and drop it into a slot while simultaneously monitoring a text display. When grasping with their fingertips, participants primarily looked at the display and relied on tactile rather than visual feedback to perform the action task. When using the tool, participants used central vision to guide grasping the ball and inserting it into the slot. Overall, the decision whether and when to look at the action task was related to the temporal regularities of the perception task. Specifically, participants looked away from the display at times when visual events were unlikely to occur, indicating that decisions about gaze allocation were sensitive to the statistics of the environment.

In the second study, participants used both hands to continuously perform two separate object grasp-and-drop tasks. I found that intermanual coordination, i.e., the synchronization between the hands, was variable but systematically related to gaze allocation. When participants first looked at both objects during grasping before looking at both goals during dropping, the two hands moved together but with a small temporal offset, allowing central vision to be used at subsequent grasp and drop locations. When participants looked at one object's grasp and drop before looking at the other's, the two hands moved with a large temporal offset, i.e., asynchronously. These results reveal that intermanual coordination is highly flexible and shaped by the demands on gaze.

In the third study, participants intercepted multiple objects with a virtual paddle. Here, participants briefly tracked each object, rather than several objects at once, before intercepting it. The time an object was tracked depended on paddle size, with participants tracking objects longer when using the small paddle. When some objects had to be avoided, participants tracked ‘avoid targets’ less frequently and for shorter durations than ‘hit targets’, suggesting that gaze resources were used efficiently.

In sum, my results highlight that gaze supports motor control by providing central vision at action-relevant locations and by guiding the effector when contacting the environment. In turn, hand movements are modulated such that gaze can arrive at action-relevant locations at critical times. Thus, patterns of eye-hand coordination dynamically adapt to varying visuomotor demands and environmental structure.

Link to the final report

https://doi.org/10.17605/OSF.IO/DXNG2
