The purpose of this research is the realization of a system that allows ordinary people, with no particular knowledge of robots, to teach motions to humanoid robots by touching them. The choice of focusing on humanoids is motivated by two main reasons. First, humanoid robots have a morphology similar to that of humans, and are thus expected to be able to use tools and infrastructure already designed for people, without the need to build new ones. Second, if appropriately programmed, their actions can be easily understood by people. Indeed, humans are already skilled at predicting other people's behavior. If robots move similarly to humans, then people will be able to unconsciously predict the robots' behavior. This, in turn, eases communication between humans and robots and improves the safety of both.
The movements that ordinary people expect from a humanoid robot when they touch it were investigated. From the collected data, a mathematical model of these responses was created. The robot was moved according to the model, and it was verified that people find the resulting movements natural. Moreover, it was shown that, given several possible responses to a touch, the model is able to predict which one is the most natural. The model was also shown to be able to classify the responses into several groups, which correspond to the groupings that ordinary people tend to make. This work is currently under review.
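The selection step described above can be illustrated with a minimal sketch. This is not the paper's actual model: the feature names, weights, and the linear scoring function below are all hypothetical, and stand in for whatever learned naturalness measure the model provides.

```python
# Purely illustrative sketch: assume each candidate response to a touch is
# described by a feature vector, and the model assigns it a scalar
# "naturalness" score. Here a hypothetical linear model plays that role.

def naturalness(features, weights):
    """Score a response; higher means predicted as more natural."""
    return sum(w * x for w, x in zip(weights, features))

def most_natural(candidates, weights):
    """Return the (name, features) pair the model predicts as most natural."""
    return max(candidates, key=lambda item: naturalness(item[1], weights))

# Hypothetical 2-D features (e.g. head motion, torso motion) and weights;
# the real features and model are those of the work under review.
weights = [0.8, -0.3]
candidates = [
    ("turn_head", [1.0, 0.1]),
    ("lean_back", [0.2, 0.9]),
    ("step_away", [0.4, 0.5]),
]
best = most_natural(candidates, weights)
print(best[0])  # name of the response scored as most natural
```

Given several observed responses, the same kind of score could also be compared across groups, which is the spirit of the classification result mentioned above.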
To study the expected movements of the head in more depth, a new 4-DOF robotic head was also developed. The study of this mechanism has important theoretical implications, and led to a publication in the journal Mechanism and Machine Theory.
Additionally, considerations on the integration of information from multiple noisy sensors led to another theoretical publication, in Communications in Nonlinear Science and Numerical Simulation.