Investigating the relationship between morphology, intrinsic body dynamics, the generation of information structure through coordinated sensorimotor activity, and learning.
In collaboration with the Developmental Cognitive Machines Lab at the University of Tokyo (Prof. Hiroshi Yokoi), we have developed a prosthetic robotic hand inspired by the muscle-tendon system of the human hand. The robotic hand has 13 degrees of freedom, and each finger is equipped with several types of sensors (flex/bend, angle, and pressure).
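To make this sensing architecture concrete, the sketch below shows one possible way to represent the hand's state in software. All names are hypothetical, since the project does not publish its firmware interface; this is an illustration, not the actual implementation.

from dataclasses import dataclass, field
from typing import List

@dataclass
class FingerSensors:
    # One reading per sensor type mounted on each finger
    flex: float = 0.0      # flex/bend sensor
    angle: float = 0.0     # joint-angle sensor
    pressure: float = 0.0  # fingertip pressure sensor

@dataclass
class HandState:
    # 13 degrees of freedom and five sensorized fingers
    joints: List[float] = field(default_factory=lambda: [0.0] * 13)
    fingers: List[FingerSensors] = field(
        default_factory=lambda: [FingerSensors() for _ in range(5)])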
In this project, we use the robotic hand to investigate the relationship between morphology, intrinsic body dynamics, the generation of information structure through coordinated sensorimotor activity, and learning. We have implemented a biologically inspired learning mechanism that allows the robotic hand to explore its own movement capabilities. Moreover, by correlating its sensory inputs with the motor outputs that produced them, the robotic hand can learn to grasp and manipulate objects on its own.
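As an illustration of this kind of sensorimotor exploration, the following Python sketch runs a simple motor-babbling loop against a simulated plant and then correlates motor commands with the resulting sensor readings. The linear plant model and all parameters are assumptions for the sake of a runnable example, not the project's actual learning mechanism.

import numpy as np

N_MOTORS, N_SENSORS, N_STEPS = 13, 15, 1000
rng = np.random.default_rng(0)

# Stand-in for the real hand: a random linear plant that maps motor
# commands to sensor readings plus noise (purely illustrative).
W_plant = rng.normal(size=(N_SENSORS, N_MOTORS))

def read_sensors(command):
    return W_plant @ command + 0.1 * rng.normal(size=N_SENSORS)

# Motor babbling: issue random commands, record sensory consequences.
motor_log, sensor_log = [], []
for _ in range(N_STEPS):
    command = rng.uniform(-1.0, 1.0, size=N_MOTORS)
    motor_log.append(command)
    sensor_log.append(read_sensors(command))

M = np.asarray(motor_log)   # shape (steps, motors)
S = np.asarray(sensor_log)  # shape (steps, sensors)

# Pearson correlation between every motor and sensor channel; entries
# of large magnitude indicate motor actions that reliably structure
# the sensory input, which is what a learner can exploit.
M_c, S_c = M - M.mean(axis=0), S - S.mean(axis=0)
corr = (M_c.T @ S_c) / (np.linalg.norm(M_c, axis=0)[:, None]
                        * np.linalg.norm(S_c, axis=0)[None, :])
print(corr.shape)  # (13, 15)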
The same robotic hand has been used as a prosthetic device. EMG signals can be used to interface the robotic hand non-invasively with a patient, and electrical stimulation can serve as a substitute for tactile feedback.
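A common way to turn raw EMG into a usable control signal is full-wave rectification followed by low-pass filtering to extract an amplitude envelope. The sketch below illustrates this standard technique with assumed parameters; the project's actual signal chain is not specified here.

import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000.0  # sampling rate in Hz (assumed)

def emg_envelope(raw_emg, cutoff_hz=5.0):
    # Remove the DC offset, full-wave rectify, then smooth with a
    # 2nd-order Butterworth low-pass filter to obtain the envelope.
    rectified = np.abs(raw_emg - np.mean(raw_emg))
    b, a = butter(2, cutoff_hz / (FS / 2), btype="low")
    return filtfilt(b, a, rectified)

# Example: a synthetic EMG trace with a burst of muscle activity
# between 0.5 s and 1.5 s; the resulting envelope could be
# thresholded to open or close the hand.
t = np.arange(0.0, 2.0, 1.0 / FS)
burst = ((t > 0.5) & (t < 1.5)).astype(float)
raw = 0.05 * np.random.randn(t.size) + burst * 0.5 * np.random.randn(t.size)
envelope = emg_envelope(raw)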
The goal of this project is the development of a prosthetic hand for sensory-motor function substitution that is dynamically coupled to an amputee's sensory and motor control system. The hand will be controlled via EMG signals and will provide various types of sensory feedback. By carefully investigating human upper-limb dynamics and by taking into account the morphological and material properties of assistive devices, we hope to develop a scheme by which patients quickly learn to control the hand with progressively less cognitive effort.
The main researchers involved in the project are: Prof. Rolf Pfeifer, Prof. Wenwei Yu, Prof. Robert Riener, Dr. Gabriel Gomez, Dr. Alejandro Hernandez, Konstantinos Dermitzakis and Dana Damian.