Imitation learning from human demonstrations in Virtual Reality for physical human-robot interaction in assistance tasks

BMBF, 2019 – 2022

In ILIAS, we propose a novel way of programming robotic assistance tasks that scales better to open task domains. In this approach, humans demonstrate how to accomplish assistance tasks in virtual environments, where the demonstrations are automatically interpreted and transformed into generalized knowledge bases. The system then uses this symbolic representation, together with high-dimensional log data from the virtual reality (VR) demonstrations, to generate and parameterize complex robot programs using deep learning.
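As a rough illustration of the first stage of such a pipeline, the sketch below turns a raw demonstration log into symbolic pick/place actions whose poses could later parameterize a robot program. All names, data structures, and the simple gripper-state heuristic are illustrative assumptions for this sketch, not the project's actual implementation:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LogEntry:
    """One sample from a hypothetical VR demonstration log."""
    t: float             # timestamp in seconds
    gripper_open: bool   # state of the demonstrator's virtual gripper
    position: tuple      # (x, y, z) end-effector position in meters

def segment_demonstration(log: List[LogEntry]) -> List[dict]:
    """Interpret a raw log as symbolic actions (toy heuristic).

    A 'pick' is emitted when the gripper closes, a 'place' when it
    reopens; each action records the pose at which it occurred, so a
    later stage could fill in the parameters of a robot program.
    """
    actions = []
    prev_open = log[0].gripper_open
    for entry in log[1:]:
        if prev_open and not entry.gripper_open:
            actions.append({"action": "pick", "at": entry.position, "t": entry.t})
        elif not prev_open and entry.gripper_open:
            actions.append({"action": "place", "at": entry.position, "t": entry.t})
        prev_open = entry.gripper_open
    return actions

# Tiny synthetic demonstration: grasp an object, carry it, release it.
demo = [
    LogEntry(0.0, True,  (0.0, 0.0, 0.30)),
    LogEntry(1.0, False, (0.2, 0.1, 0.05)),  # gripper closes -> pick
    LogEntry(2.5, True,  (0.5, 0.4, 0.05)),  # gripper opens  -> place
]
print(segment_demonstration(demo))
```

In the full system described above, such symbolic action sequences would be stored in a knowledge base, while the dense log data (trajectories, forces, timing) remains available for a learned model to parameterize the generated program.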