Humanoid Robots

Bipedal Locomotion: Progress and Innovations

Bipedal robots lag behind their biological counterparts in agility and efficiency, primarily due to the difficulty of replicating the human musculoskeletal system. At the Robotics Research Lab, efforts to close this gap focus on bio-inspired control strategies and advanced mechanical designs. A key innovation is the Compliant Robotic Leg (CARL), featuring back-drivable, compliant Series Elastic Actuators (RRLAB-SEAs) that provide human-like properties such as muscle-like compliance, force amplification, low inertia, agility, and reduced power consumption.
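
To make the actuator principle concrete, the following minimal Python sketch shows how a series elastic actuator senses torque through the deflection of a spring placed between motor and joint, and how a simple force loop drives that deflection toward a desired value. The stiffness, gains, and function names are illustrative assumptions, not the RRLAB-SEA firmware or its API.

    # Minimal sketch of series-elastic torque sensing and control; the spring
    # stiffness, gains, and names are illustrative assumptions, not the
    # RRLAB-SEA implementation.

    K_SPRING = 350.0    # assumed series-spring stiffness [Nm/rad]
    KP, KD = 8.0, 0.05  # illustrative force-loop gains

    def spring_torque(theta_motor: float, theta_joint: float) -> float:
        """Torque transmitted to the joint, estimated from spring deflection."""
        return K_SPRING * (theta_motor - theta_joint)

    def force_control_step(tau_desired, theta_motor, theta_joint, tau_prev, dt):
        """One force-loop cycle: adjust the motor so the measured spring torque
        tracks the desired torque; returns a motor velocity command."""
        tau_meas = spring_torque(theta_motor, theta_joint)
        error = tau_desired - tau_meas
        damping = KD * (tau_meas - tau_prev) / dt  # derivative on measurement
        return KP * error - damping, tau_meas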

To advance Bio-inspired Behavior-Based Bipedal Locomotion Control (B4LC), CARL exploits passive dynamics and mirrors anthropomorphic traits such as actuation principle, kinematic layout, and weight distribution. Its mono- and bi-articular actuation system, coupled via direct and four-bar linkages, provides a robust and redundant structure. Inspired by the success that human amputees have had with SEA-driven prostheses and prosthetic feet, CARL incorporates a commercial prosthetic foot to achieve natural, energy-efficient walking.
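
The redundancy of this arrangement can be illustrated with a small sketch: each actuator acts on the joints through its moment arms, and the bi-articular actuator spans hip and knee simultaneously, so a desired pair of joint torques can be distributed over three actuators. The moment-arm values, joint selection, and least-squares distribution below are illustrative assumptions rather than CARL's actual linkage geometry.

    # Illustrative mapping from mono-/bi-articular actuator forces to hip and
    # knee torques; the moment arms below are assumptions, not CARL's geometry.
    import numpy as np

    # Rows: [hip, knee]; columns: [mono hip, mono knee, bi-articular hip-knee].
    # The bi-articular actuator has a non-zero moment arm at both joints.
    MOMENT_ARMS = np.array([
        [0.05, 0.00, 0.04],   # hip moment arms [m]
        [0.00, 0.05, -0.03],  # knee moment arms [m]
    ])

    def joint_torques(actuator_forces: np.ndarray) -> np.ndarray:
        """tau = R @ f: torques produced by the three actuator forces [N]."""
        return MOMENT_ARMS @ actuator_forces

    def distribute_forces(tau_desired: np.ndarray) -> np.ndarray:
        """Minimum-norm split of desired joint torques over the redundant actuators."""
        return np.linalg.pinv(MOMENT_ARMS) @ tau_desired

    # Round trip: the distributed forces reproduce the requested torques.
    print(joint_torques(distribute_forces(np.array([20.0, -10.0]))))  # ~[20., -10.]

Because the moment-arm matrix has more columns than rows, the pseudoinverse picks the minimum-norm force split, which is one simple way of exploiting the redundancy.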

A dedicated test rig, integrating a treadmill and a lifting mechanism that simulates a second, virtual leg, was used to validate CARL's impedance and force control, including in coupled configurations. Preliminary walking motions were achieved by porting a subsystem of B4LC to CARL, demonstrating its feasibility for real-world applications. These efforts complement studies on simulated bipeds, such as a 1.80 m, 76 kg robot, which demonstrated robust adaptation to uneven terrain and external disturbances.
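
The kind of impedance behavior exercised on the rig can be summarized as a virtual spring-damper acting at the foot, mapped to joint torques through the leg Jacobian and then tracked by the SEA force loops. The sketch below states that law with illustrative gains; it is a simplified assumption, not the controller actually running on CARL.

    # Sketch of a Cartesian impedance law for the foot; stiffness and damping
    # values are illustrative assumptions.
    import numpy as np

    K = np.diag([800.0, 1200.0])  # virtual stiffness [N/m] in (x, z)
    D = np.diag([40.0, 60.0])     # virtual damping [Ns/m]

    def impedance_force(x_ref, xd_ref, x, xd):
        """Force of a virtual spring-damper pulling the foot toward its reference."""
        return K @ (x_ref - x) + D @ (xd_ref - xd)

    def joint_torque_command(jacobian, x_ref, xd_ref, x, xd):
        """Map the Cartesian force to joint torques via the leg Jacobian transpose;
        the SEA force loops then track these torques."""
        return jacobian.T @ impedance_force(x_ref, xd_ref, x, xd)

In such a setup, tracking the resulting joint torques with the SEA force loops is what lets the leg yield compliantly on contact rather than fighting the treadmill.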

CARL represents a significant step toward deploying B4LC on humanoid platforms. Future research aims to integrate CARL into larger robotic systems, refining its ability to handle complex, adaptive walking behaviors while bridging the gap between simulation and physical implementation.
 

Robin and Emah: Advancing Human-Robot Interaction (HRI)

Human-Robot Interaction (HRI) research focuses on creating natural and efficient communication between humans and robots through multimodal interaction. By utilizing gestures, facial expressions, and contextual cues, robots like Robin and Emah (the latter built on Engineered Arts' Ameca platform) push the boundaries of human-like interaction.

Robin: A Proven Platform for HRI

Robin, equipped with a back-lit projected face and 35 degrees of freedom in its upper body, supports rich interaction through expressive facial gestures and intelligent hand movements. Key features include:

  • Language Capabilities: Built-in speech synthesis in English and German.
  • Interaction Features: Gesture recognition, facial expression tracking, and personality detection through non-verbal cues.
  • Validation Scenarios: Demonstrated effectiveness in 20-question games and general interactive scenarios by adapting to user behavior and contextual information.

Robin's ability to detect low-level perceptual features such as gaze, ethnicity, and posture, combined with high-level personality recognition, enables targeted, personalized interactions. These capabilities have been instrumental in validating human feedback behavior and personality detection in HRI studies.
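
As an illustration of how such low-level cues could feed a high-level estimate, the sketch below combines a few non-verbal features into a single extraversion score and uses it to pick a dialogue style. The cue names, weights, and threshold are hypothetical and only stand in for Robin's actual perception and personality pipeline.

    # Hypothetical fusion of non-verbal cues into a personality estimate; the
    # cues, weights, and threshold are invented for illustration only.
    from dataclasses import dataclass

    @dataclass
    class NonVerbalCues:
        gaze_contact_ratio: float  # fraction of time the user holds eye contact (0..1)
        speech_rate: float         # normalized speaking speed (0..1)
        posture_openness: float    # normalized openness of body posture (0..1)

    def estimate_extraversion(cues: NonVerbalCues) -> float:
        """Weighted combination of cues into an extraversion score in [0, 1]."""
        score = (0.40 * cues.gaze_contact_ratio
                 + 0.35 * cues.speech_rate
                 + 0.25 * cues.posture_openness)
        return min(max(score, 0.0), 1.0)

    def choose_dialogue_style(extraversion: float) -> str:
        """Personalize the interaction: a livelier style for extraverted users."""
        return "animated" if extraversion > 0.6 else "reserved"

    user = NonVerbalCues(gaze_contact_ratio=0.8, speech_rate=0.7, posture_openness=0.5)
    print(choose_dialogue_style(estimate_extraversion(user)))  # "animated"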

Emah: A Next-Generation HRI System

The Emah system, implemented on the Generation 1 Ameca robot, builds upon Robin’s foundation with enhanced hardware and software capabilities:

  • Advanced Perception: Integrates external sensors, including a ZED2 camera and microphone, for improved scene awareness and human detection.
  • Speech and Emotion: Leverages Google’s Speech-to-Text and team-developed emotion generation for lifelike speech and lipsync, paired with Ameca’s expressive capabilities (a simplified interaction loop is sketched after this list).
  • Realistic Design: Features a silicone-based lifelike face, binocular eye-mounted cameras, and a chest-mounted ZED2 camera for tracking and eye contact.
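
The sketch below outlines one perception-to-action cycle of such a system: transcribe the user's speech, generate a reply together with an emotion label, and render both through the robot's face and voice. Every interface shown (transcribe, generate_reply, RobotFace) is a hypothetical stand-in; none of these are the real Speech-to-Text or Ameca APIs.

    # Hypothetical interaction loop; transcribe(), generate_reply(), and
    # RobotFace are stand-ins for the speech-to-text service and the Ameca
    # control interface, not their real APIs.

    def transcribe(audio_chunk: bytes) -> str:
        """Stand-in for the cloud speech-to-text call (assumption)."""
        return "hello robot"  # canned transcript for illustration

    def generate_reply(utterance: str) -> tuple[str, str]:
        """Stand-in for dialogue and emotion generation; returns (text, emotion)."""
        return f"You said: {utterance}", "happy"

    class RobotFace:
        """Stand-in for the robot's expression and speech interface (assumption)."""
        def set_expression(self, emotion: str) -> None:
            print(f"[face]   {emotion}")

        def say(self, text: str) -> None:
            print(f"[speech] {text}")

    def interaction_step(audio_chunk: bytes, robot: RobotFace) -> None:
        """One cycle: listen, decide what to say and how to look, then act."""
        reply, emotion = generate_reply(transcribe(audio_chunk))
        robot.set_expression(emotion)  # expressive face also drives the lipsync
        robot.say(reply)

    interaction_step(b"", RobotFace())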

Emah has been validated in studies focusing on social perception, scene awareness, and observational learning. Its design, optimized for realistic interaction, enhances engagement and supports user studies in HRI.