Advancing Human-to-Robot Motion Translation
A groundbreaking paper titled “Robust and Expressive Humanoid Motion Retargeting via Optimization‑Based Rig Unification” has been officially accepted to IROS 2025, which will be held October 19–25 in Hangzhou, China. Led by Taemoon Jeong, the research, conducted in collaboration with NAVER LABS Corp. and Joohyung Kim at UIUC, introduces a powerful, versatile motion-retargeting pipeline that translates human movements into expressive and physically realistic robot motions across diverse humanoid platforms.
Key Innovations: Rig Unification Meets Real-Time Execution
- Rig Unification: The method maps human skeleton poses to a unified kinematic rig, regardless of robot morphology.
- Optimization-Based Refinement: Balancing, contact, joint limits, and trajectory smoothness are optimized for physical feasibility.
- The pipeline supports noisy video input, enabling retargeting from motion capture systems or RGB video streams.
- Validated on 12 simulated humanoids and 3 physical robots, including the AMBIDEX platform, while preserving expressive dynamics and human intent in robot movements.
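To make the optimization-based refinement step concrete, here is a minimal, hypothetical sketch (not the paper's actual code or robot model): a toy 2-link planar arm is retargeted to a human wrist position by minimizing a tracking cost plus a smoothness penalty, subject to joint limits. The link lengths, joint bounds, and `smooth_weight` are all illustrative assumptions.

```python
# Hypothetical sketch of optimization-based retargeting on a toy 2-link
# planar arm (illustrative only; not the paper's pipeline or robot model).
import numpy as np
from scipy.optimize import minimize

L1, L2 = 0.5, 0.4  # link lengths (assumed values)

def forward_kinematics(q):
    """End-effector (x, y) position of the 2-link arm for joint angles q."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def retarget(target_xy, q_prev, smooth_weight=0.01):
    """Solve for joint angles that track target_xy while staying close
    to the previous pose (a stand-in for trajectory smoothness)."""
    def cost(q):
        track = np.sum((forward_kinematics(q) - target_xy) ** 2)
        smooth = smooth_weight * np.sum((q - q_prev) ** 2)
        return track + smooth
    bounds = [(-np.pi, np.pi), (0.0, np.pi)]  # joint limits (assumed)
    return minimize(cost, q_prev, bounds=bounds).x

# Retarget a human wrist position to the toy arm:
q = retarget(np.array([0.6, 0.3]), q_prev=np.array([0.3, 0.5]))
print(forward_kinematics(q))  # lands close to the target [0.6, 0.3]
```

The full method adds balance and contact constraints and runs across a unified rig shared by all robot morphologies; this sketch only shows the basic shape of such an objective, where tracking fidelity trades off against smoothness and joint feasibility.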
Real-World Relevance: Bridging Simulation and Reality
The team ran real-world retargeting experiments at NAVER LABS’ headquarters. Running on physical robots rather than only in simulation, the resulting motions demonstrated that expressive, physically plausible movements can be generated even from imperfect input data. This capability paves the way for natural human-robot interaction in diverse domains like entertainment, telepresence, and assistive robotics.
Why It Matters: Toward Adaptive, Expressive Humanoids
Motion retargeting is critical for embodied AI systems to behave naturally and adaptively. This work advances the field by:
- Handling structural differences across robots via rig unification
- Enabling real-time execution on actual hardware
- Preserving both expressiveness and feasibility across platforms
By supporting multiple hardware configurations—including the dexterous AMBIDEX—the research moves closer to general-purpose humanoid deployment.
Upcoming Showcase at IROS 2025
We expect a full presentation of this methodology at IROS 2025 in Hangzhou, one of the premier global events for intelligent systems and robotics. The conference will provide fertile ground for further discussion and collaboration around motion synthesis, full-body expressivity, and human-robot interaction.
Summary
Led by Taemoon Jeong, this accepted IROS 2025 paper delivers:
- A unified rig-based motion retargeting pipeline
- Real-world validation across 3 humanoid robots
- Expressive and physically plausible robot motion from noisy video input
- A scalable approach to real-time humanoid movement synthesis
Credit: Sungjoon Choi