Problem

How can haptic feedback be designed so that it conveys semantic meaning to users?


Goal

Developing the affective and perception layers of the humanoid for a better human-robot interaction experience.


Role

Research and Development


Advisor

Prof. Nadia Thalmann


Nadine’s platform is implemented as a classic Perception-Decision-Action architecture, as described below:

  1. The perception layer is composed of a Microsoft Kinect V2 and a microphone. Perception includes face recognition, gesture recognition, and some understanding of social situations.
  2. For decision-making, the platform includes emotion and memory models as well as social attention.
  3. Finally, the action layer consists of a dedicated robot controller that handles emotional expression, lip synchronization, and online gaze generation.

Controlling and monitoring this many modules requires a tool that can quickly identify any issue. The aim of this project was to develop a module that binds these functions together and represents a scalable social robotics platform, as sketched below.
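
To make the architecture concrete, here is a minimal Python sketch of a Perception-Decision-Action cycle bound into one monitorable loop. All class, method, and field names are illustrative placeholders I chose for this sketch, not Nadine’s actual API.

```python
# Minimal sketch of a Perception-Decision-Action binding module.
# Class and method names are illustrative, not Nadine's actual API.
from dataclasses import dataclass, field


@dataclass
class Percept:
    """Bundle of one perception cycle's outputs."""
    faces: list = field(default_factory=list)      # recognised identities
    gestures: list = field(default_factory=list)   # recognised gestures
    utterance: str = ""                            # transcribed speech


class PerceptionLayer:
    """Wraps the sensor modules (e.g. Kinect V2 data, microphone)."""
    def sense(self) -> Percept:
        # In the real platform this would poll the Kinect and speech modules.
        return Percept()


class DecisionLayer:
    """Stands in for the emotion and memory models plus social attention."""
    def decide(self, percept: Percept) -> dict:
        # Placeholder policy: greet anyone whose face was recognised.
        if percept.faces:
            return {"speech": f"Hello, {percept.faces[0]}!", "emotion": "happy"}
        return {"speech": "", "emotion": "neutral"}


class ActionLayer:
    """Stands in for the robot controller: expression, lip sync, gaze."""
    def act(self, command: dict) -> None:
        print(f"[{command['emotion']}] {command['speech']}")


def run_once(perception: PerceptionLayer,
             decision: DecisionLayer,
             action: ActionLayer) -> None:
    """One Perception-Decision-Action cycle, exposed as a single monitorable unit."""
    percept = perception.sense()
    command = decision.decide(percept)
    action.act(command)


if __name__ == "__main__":
    run_once(PerceptionLayer(), DecisionLayer(), ActionLayer())
```

Keeping each layer behind a small interface like this is what lets a binding module monitor every stage of the cycle and localise a fault to perception, decision, or action.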


Nadine Framework

The first phase involved revamping the perception layer. Since we were moving from a C++-based to a Python-based architecture, this meant rebuilding the face recognition, gesture recognition, and social-situation understanding modules. In the second phase, I reimplemented the affective layer, which included a behaviour decision tree that mapped incoming sentences and candidate responses to the corresponding reactions and emotions; a sketch of such a tree follows.
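
The snippet below is a small illustration of how a behaviour decision tree can map an incoming sentence to a response, gesture, and emotion. The node structure, rules, and reactions are hypothetical examples, not the actual Nadine implementation.

```python
# Illustrative behaviour decision tree: maps an input sentence to a
# response, gesture, and emotion. Rules and reactions are hypothetical.
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Reaction:
    response: str
    gesture: str
    emotion: str


@dataclass
class Node:
    """Inner nodes test the sentence and branch; leaf nodes hold a Reaction."""
    test: Optional[Callable[[str], bool]] = None
    yes: Optional["Node"] = None
    no: Optional["Node"] = None
    leaf: Optional[Reaction] = None

    def evaluate(self, sentence: str) -> Reaction:
        if self.leaf is not None:
            return self.leaf
        branch = self.yes if self.test(sentence) else self.no
        return branch.evaluate(sentence)


# A tiny example tree: greetings and thanks get distinct reactions.
tree = Node(
    test=lambda s: any(w in s.lower() for w in ("hello", "hi")),
    yes=Node(leaf=Reaction("Hello! Nice to see you.", "wave", "happy")),
    no=Node(
        test=lambda s: "thank" in s.lower(),
        yes=Node(leaf=Reaction("You're welcome!", "nod", "pleased")),
        no=Node(leaf=Reaction("Could you tell me more?", "tilt_head", "neutral")),
    ),
)

print(tree.evaluate("Hi Nadine"))          # wave + happy
print(tree.evaluate("Thank you so much"))  # nod + pleased
```

In the full system, the leaves would drive the action layer (speech, expression, and gesture controllers) rather than simply printing the chosen reaction.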