Subproject: Auditory social interaction enhancement
Communication between acting systems operates across several modalities: besides spoken language, visual features such as facial expressions and gestures, posture and body movement, and spatial configuration (e.g. relative position, distance or proximity, line of sight) convey relevant information. A central premise of the research project is that even complex forms of social interaction are largely grounded in fundamental sensorimotor patterns. Learning and controlling action-effect contingencies is considered an integral part of acting systems that cooperate in social contexts. This applies to humans acting together as well as to human-machine interaction, e.g. when humans cooperate with a humanoid robot. The aim of the current research project is therefore to gain further insight into these relationships, which may improve the future comprehensibility of humanoid robots cooperating with humans. In this context, the Science in Motion Research Group of the Leibniz University Hanover focuses primarily on the auditory coding of kinematic characteristics and spatial-configuration features in nonverbal communication. Auditory social interaction enhancement is a subproject within the EU H2020 FETPROACT project Socialising Sensory-Motor Contingencies (socSMCs).
Prof. Dr. Alfred Effenberg, Dr. Gerd Schmitz (Hanover)
EU H2020 FETPROACT