Robin

ROBot-human INteraction

Description

Interaction between humans and robots is often limited to input devices such as keyboards and mice. Future applications of mobile robots will rely on natural interaction. A service robot, for example, should be able to help a person with housework, so it must be controllable without specific technical knowledge. Elderly people in particular want to communicate in a natural way.

Communication between humans is based not only on speech; it is a complex interplay of speech, gestures, facial expressions and emotional displays. It is therefore necessary to observe the movements of a communication partner, which can range from emotional facial expressions to a wave of the hand. Robin is an advanced upper-torso robot with a back-projected face, manufactured by Engineered Arts as a RoboThespian.

This research focuses on efficient human-robot interaction. The goal is to recognize human emotions and predict human intentions so the robot can interact in a natural, human-like way. In this regard, RRLAB has conducted extensive work on the recognition of faces, humans, facial expressions, static and dynamic hand gestures, head gestures and head poses, as well as higher-level perception tasks such as inferring human feedback and intentions. A dialog system has been established that builds on these basic modules and generates interaction scenarios, helping the robot interact with humans naturally. An important step towards natural communication is therefore the dynamic modeling of humans: such a model enables the robot to interpret the movements of a communication partner and react adequately.

Over the years, affective interaction has been established based on the real-time recognition of human nonverbal cues. Including verbal cues is essential to better understand what an interaction partner conveys, so recent work at RRLAB focuses on recognizing verbal cues during interaction. Text-based interaction scenarios have been evaluated successfully, and a real-time speech recognition system has been integrated to enable verbal interaction. Various information extraction strategies have been employed to provide a notion of context awareness during interaction. Current research focuses on a scientific framework for improving both the linguistic and the cognitive skills of the humanoid robot.
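As a minimal illustration of the kind of verbal/nonverbal cue fusion described above, the sketch below combines a recognized facial expression with a text-sentiment score into a coarse affective label that a dialog system could use to select a reaction. All names, valence values and weights here are hypothetical assumptions for illustration, not RRLAB's actual implementation.

```python
# Illustrative sketch (hypothetical values, not RRLAB's implementation):
# fuse a nonverbal cue (recognized facial expression) with a verbal cue
# (text-sentiment score) into one coarse affective label.

# Assumed valence scores for a few recognized facial expressions.
EXPRESSION_VALENCE = {"happy": 1.0, "neutral": 0.0, "sad": -0.8, "angry": -1.0}

def fuse_cues(expression: str, sentiment: float, w_verbal: float = 0.5) -> str:
    """Blend nonverbal and verbal valence into 'positive'/'neutral'/'negative'.

    sentiment is assumed to lie in [-1, 1], e.g. from a text-sentiment model;
    w_verbal weights the verbal cue against the facial expression.
    """
    nonverbal = EXPRESSION_VALENCE.get(expression, 0.0)
    valence = (1 - w_verbal) * nonverbal + w_verbal * sentiment
    if valence > 0.3:
        return "positive"
    if valence < -0.3:
        return "negative"
    return "neutral"

# Example: a smiling partner saying something mildly positive.
print(fuse_cues("happy", 0.6))   # -> positive
```

A dialog system could map each label to a robot behavior (e.g. a smile animation for "positive"); the real systems cited in the publications below use richer models such as the circumplex model of emotion.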

Publications

  • “You Scare Me”. The Effects of Humanoid Robot Appearance, Emotion, and Interaction Skills on Uncanny Valley Phenomenon.
    Karsten Berns and Ashita Ashok
    Actuators, pp. 19. (2024)
    https://www.mdpi.com/2076-0825/13/10/419
  • Multimodal Episodic Analysis of Human Personality Traits for Personalized Behavior in Social Robots.
    Sarwar Paplu, Bhalachandra Bhat and Karsten Berns
    Proceedings of the 22nd IEEE-RAS International Conference on Humanoid Robots (Humanoids), (2023)
    DOI: 10.1109/Humanoids57100.2023.10375182
  • Personalized Human-Robot Interaction Based on Multimodal Perceptual Cues.
    Sarwar Paplu
    (2023)
  • Social Perception and Scene Awareness in Human-Robot Interaction.
    Sarwar Paplu, Prabesh Khadka, Bhalachandra Bhat and Karsten Berns
    Proceedings of the 15th International Conference on Social Robotics (ICSR2023), pp. 123 - 132. (2023)
    https://doi.org/10.1007/978-981-99-8718-4_11
  • Social Robot Dressing Style. An evaluation of interlocutor preference for University Setting.
    Ashita Ashok, Sarwar Paplu and Karsten Berns
    2023 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), pp. 6. (2023)
    DOI: 10.1109/RO-MAN57019.2023.10309491
  • Harnessing Long-term Memory for Personalized Human-Robot Interactions.
    Sarwar Paplu, Raúl Navarro and Karsten Berns
    Proceedings of the 21st IEEE-RAS International Conference on Humanoid Robots (Humanoids), pp. 377 - 382. (2022)
    DOI: 10.1109/Humanoids53995.2022.10000213
  • Multimodal Perceptual Cues for Context-aware Human-Robot Interaction.
    Sarwar Paplu, Hasnat Ahmed, Ashita Ashok, Sevilay Akkus and Karsten Berns
    The Joint International Conference of the 13th IFToMM International Symposium on Science of Mechanisms and Machines (SYROM 2022) and the XXV International Conference on Robotics (ROBOTICS 2022), Vol. 127, pp. 283 - 294. (2022)
    https://doi.org/10.1007/978-3-031-25655-4_29
  • Paralinguistic Cues in Speech to Adapt Robot Behavior in Human-Robot Interaction.
    Ashita Ashok, Jakub Pawlak, Sarwar Paplu, Zuhair Zafar and Karsten Berns
    Proceedings of the 9th IEEE RAS/EMBS International Conference on Biomedical Robotics & Biomechatronics (BioRob), (2022)
    DOI: 10.1109/BioRob52689.2022.9925505
  • Real-time Emotion Appraisal with Circumplex Model for Human-Robot Interaction.
    Sarwar Paplu, Chinmaya Mishra and Karsten Berns
    arXiv preprint arXiv:2202.09813, (2022)
    https://arxiv.org/abs/2202.09813
  • Towards Emotion-aware Personalized Human-Robot Interaction.
    Sarwar Paplu
    5th Young Researchers Symposium 2022, pp. 43. (2022)
  • Adapting Behaviour of Socially Interactive Robot based on Text Sentiment.
    Sarwar Paplu, Mohammed Arif and Karsten Berns
    30th International Conference on Robotics in Alpe-Adria-Danube Region (RAAD), Vol. Advances in Service and Industrial Robotics, pp. 221 - 228. (2021)
    https://doi.org/10.1007/978-3-030-75259-0_24
  • Exploiting Conversational Topic Awareness for Interactive Robots.
    Sarwar Paplu, Mohammed Arif, Nishan Tamang and Karsten Berns
    Proceedings of the 20th IEEE International Conference on Advanced Robotics (ICAR), pp. 474 - 479. (2021)
    DOI: 10.1109/ICAR53236.2021.9659484
  • Personality Traits Assessment using PAD Emotional Space in Human-robot Interaction.
    Zuhair Zafar, Ashita Ashok and Karsten Berns
    Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2021) - Volume 2, pp. - 111. (2021)
  • Towards Linguistic and Cognitive Competence for Socially Interactive Robots.
    Sarwar Paplu and Karsten Berns
    Proceedings of the 9th IEEE International Conference on Robot Intelligence Technology and Applications (RITA), No. 429, pp. 520 - 530. (2021)
    DOI: 10.1007/978-3-030-97672-9_47
  • Utilizing Semantic and Contextual Information during Human-Robot Interaction.
    Sarwar Paplu, Mohammed Arif and Karsten Berns
    Proceedings of the 11th IEEE International Conference on Development and Learning (ICDL), pp. 1 - 6. (2021)
    https://doi.org/10.1109/ICDL49984.2021.9515611
  • Multimodal Fusion of Human Behavioural Traits. A Step Towards Emotionally Intelligent Human-Robot Interaction.
    Zuhair Zafar
    (2020)
    http://nbn-resolving.org/urn:nbn:de:hbz:386-kluedo-59800
  • Pseudo-Randomization in Automating Robot Behaviour during Human-Robot Interaction.
    Sarwar Paplu, Chinmaya Mishra and Karsten Berns
    Proceedings of the 10th IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL-ER), pp. 120 - 125. (2020)
    https://doi.org/10.1109/ICDL-EpiRob48136.2020.9278115
  • Automatic Assessment of Human Personality Traits. A Step towards Intelligent Human-Robot Interaction.
    Zuhair Zafar, Sarwar Paplu and Karsten Berns
    2018 IEEE-RAS 18th International Conference on Humanoid Robotics (Humanoids), pp. 670 - 675. (2018)
    https://doi.org/10.1109/HUMANOIDS.2018.8624975
  • Emotion Based Human-Robot Interaction.
    Karsten Berns and Zuhair Zafar
    Proceedings of the 13th International Scientific-Technical Conference on Electromechanics and Robotics "Zavalishin’s Readings" (ER(ZR)), (2018)
  • Real-Time Recognition of Human Postures for Human-Robot Interaction.
    Zuhair Zafar, Rahul Venugopal and Karsten Berns
    Proceedings of the 11th International Conference on Advances in Computer-Human Interactions (ACHI), pp. 114 - 119. (2018)
  • Real-time Recognition of Extroversion-Introversion Trait in Context of Human-Robot Interaction.
    Zuhair Zafar, Sarwar Paplu and Karsten Berns
    Advances in Service and Industrial Robotics, Vol. 67, pp. 63 - 70. (2018)
    https://doi.org/10.1007/978-3-030-00232-9
  • Ability of Humanoid Robot to Perform Emotional Body Gestures.
    Djordje Urukalo, Ljubinko Kevac, Zuhair Zafar, Salah Al-Darraji, Aleksandar Rodić and Karsten Berns
    Advances in Service and Industrial Robotics, Vol. 49, pp. 657 - 664. (2017)
    http://www.springer.com/gp/book/9783319612751 ISBN: 978-3-319-61275-1
  • Human Robot Interaction using Dynamic Hand Gestures.
    Zuhair Zafar, Daniel Villarreal, Salah Al-Darraji, Djordje Urukalo, Karsten Berns and Aleksandar Rodić
    Advances in Service and Industrial Robotics, Vol. 49, pp. 647 - 656. (2017)
    http://www.springer.com/gp/book/9783319612751 ISBN: 978-3-319-61275-1
  • Interactive Communication Between Human and Robot Using Nonverbal Cues.
    Salah Al-Darraji, Zuhair Zafar, Karsten Berns, Djordje Urukalo and Aleksandar Rodić
    Advances in Service and Industrial Robotics, Vol. 49, pp. 673 - 680. (2017)
    http://www.springer.com/gp/book/9783319612751 ISBN: 978-3-319-61275-1
  • Action Unit Based Facial Expression Recognition Using Deep Learning.
    Salah Al-Darraji, Karsten Berns and Aleksandar Rodić
    Proceedings of the 25th International Conference on Robotics in Alpe-Adria-Danube Region, RAAD 2016, Vol. 540, pp. 413 - 420. (2016)
    http://www.springer.com/gp/book/9783319490571 ISBN: 978-3-319-49057-1
  • Embodiment of Human Personality with EI-Robots by Mapping Behaviour Traits from Live-Model.
    Aleksandar Rodić, Djordje Urukalo, Milica Vujović, Sofija Spasojević, Marija Tomić, Karsten Berns, Salah Al-Darraji and Zuhair Zafar
    Advances in Intelligent Systems and Computing, Vol. 540, pp. 437 - 448. (2016)
    http://www.springer.com/gp/book/9783319490571 ISBN: 978-3-319-49057-1
  • Perception of Nonverbal Cues for Human-Robot Interaction.
    Salah Al-Darraji
    (2016)
    http://www.dr.hut-verlag.de/9783843928526.html ISBN-13: 978-3-8439-2852-6
  • Real-time Perception of Nonverbal Human Feedback in a Gaming Scenario.
    Salah Al-Darraji, Zuhair Zafar and Karsten Berns
    Proceedings of the 2016 British HCI Conference, (2016)
  • Recognizing Hand Gestures Using Local Features. A Comparison Study.
    Zuhair Zafar, Karsten Berns and Aleksandar Rodić
    Advances in Intelligent Systems and Computing, Vol. 540, pp. 394 - 401. (2016)
    http://www.springer.com/gp/book/9783319490571 ISBN: 978-3-319-49057-1
  • Recognizing Hand Gestures for Human-Robot Interaction.
    Zuhair Zafar and Karsten Berns
    Proceedings of the 9th International Conference on Advances in Computer-Human Interactions (ACHI), pp. 333 - 338. (2016)
  • A Multimodal Nonverbal Human-Robot Communication System.
    Salah Saleh, Manish Sahu, Zuhair Zafar and Karsten Berns
    Proceedings of the 6th International Conference on Computational Bioengineering (ICCB), (2015)