# Motion Capture Retargeting Tools for Humanoid Robots

**Source:** Multiple (survey compiled from web search)
**Fetched:** 2026-02-13
**Type:** Community Survey

---

## Description

A survey of open-source tools and frameworks for retargeting human motion capture data to humanoid robots, with emphasis on tools applicable to the Unitree G1. Motion retargeting is the process of adapting motion data captured from a human performer to a robot with different body proportions, joint limits, and kinematic structure.

## Key Features / Contents

### 1. GMR (General Motion Retargeting) -- RECOMMENDED FOR G1

**Repository:** https://github.com/YanjieZe/GMR
**Paper:** ICRA 2026 -- "Retargeting Matters: General Motion Retargeting for Humanoid Motion Tracking"
**License:** Open-source

The most comprehensive and actively maintained retargeting tool with direct G1 support.

**Supported Robots (18 platforms)**:

- **Unitree G1** (29 DoF): legs (12) + waist (3) + arms (14)
- **Unitree G1 with Hands** (43 DoF): adds 14 hand DoF
- **Unitree H1** (19 DoF): legs (10) + waist (1) + arms (8)
- **Unitree H1-2** (27 DoF): legs (12) + waist (1) + arms (14)
- Booster T1, K1; HighTorque Hi; Galexea R1 Pro; KUAVO; Berkeley Humanoid Lite; PAL Talos; and more

**Supported Input Formats**:

- SMPL-X (AMASS, OMOMO datasets)
- BVH (LAFAN1, Nokov, Xsens)
- FBX (OptiTrack)
- PICO streaming (XRoboToolkit) for real-time retargeting

**Key Capabilities**:

- Real-time retargeting on CPU
- Addresses foot sliding, ground penetration, and self-intersection
- Joint velocity limits (default 3π rad/s)
- CSV export compatible with beyondmimic
- Retargeter for the TWIST/TWIST2 teleoperation systems

**Installation**:

```bash
conda create -n gmr python=3.10
conda activate gmr
git clone https://github.com/YanjieZe/GMR.git
cd GMR
pip install -e .
# Download SMPL-X body models separately
```

---

### 2. Rofunc

**Repository:** https://github.com/Skylark0924/Rofunc
**Docs:** https://rofunc.readthedocs.io/
**Lab:** CLOVER Lab, CUHK

A full-process Python package for robot learning from demonstration and manipulation, with built-in retargeting.

**Features**:

- Motion capture data retargeting to heterogeneous humanoid robots
- Demonstration collection and pre-processing
- LfD (Learning from Demonstration) algorithms
- Planning and control methods
- CURI synergy-based softhand grasping
- Genesis simulator support (December 2024)

**G1 Relevance**: Supports retargeting to various humanoid embodiments; can be adapted to the G1 with an appropriate URDF configuration.

**Installation**:

```bash
pip install rofunc
```

---

### 3. LocoMuJoCo

**Repository:** https://github.com/robfiras/loco-mujoco
**Docs:** https://loco-mujoco.readthedocs.io/
**Paper:** "LocoMuJoCo: A Comprehensive Imitation Learning Benchmark for Locomotion"

An imitation learning benchmark with extensive retargeted motion datasets.

**Features**:

- 12 humanoid and 4 quadruped environments
- 4 biomechanical human models
- 22,000+ motion capture datasets (AMASS, LAFAN1, native) retargeted for each humanoid
- **Robot-to-robot retargeting**: retarget datasets from one robot to another
- Built-in algorithms: PPO, GAIL, AMP, DeepMimic (JAX implementations)
- Three dataset types: real (MoCap), perfect, and preference

**G1 Relevance**: While the primary humanoids are biomechanical models, robot-to-robot retargeting can transfer datasets to the G1.

**Installation**:

```bash
pip install loco-mujoco
```

---

### 4. Human2Humanoid / OmniH2O

**Repository:** https://github.com/LeCAR-Lab/human2humanoid
**Papers:** IROS 2024 ("Learning Human-to-Humanoid Real-Time Whole-Body Teleoperation"), CoRL 2024 ("OmniH2O")

Real-time whole-body teleoperation and motion transfer from human to humanoid.
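Whether the transfer is learned (as in this tool) or purely kinematic (as in GMR), retargeted trajectories are typically post-processed by clamping to the robot's URDF joint limits and capping joint velocities; GMR's default velocity cap is 3π rad/s. A minimal numpy sketch of that pass, with illustrative limits rather than any robot's real values:

```python
import numpy as np

def clamp_motion(q, q_min, q_max, dt, v_max=3 * np.pi):
    """Clamp a (T, DoF) joint trajectory to position and velocity limits."""
    q = np.clip(q, q_min, q_max)                  # URDF position limits
    out = q.copy()
    for t in range(1, len(out)):
        # Limit how far each joint may move per timestep (velocity cap)
        step = np.clip(out[t] - out[t - 1], -v_max * dt, v_max * dt)
        out[t] = out[t - 1] + step
    return out

# Two-DoF toy trajectory at 50 Hz; limits are illustrative, not the G1's
traj = np.array([[0.0, 0.0], [2.0, -2.0], [2.1, -2.1]])
safe = clamp_motion(traj, q_min=-1.5, q_max=1.5, dt=0.02)
# Each per-joint step is now at most 3π * 0.02 ≈ 0.19 rad
```

A real retargeter would pull `q_min`/`q_max` per joint from the URDF; the scalar limits here keep the sketch short.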
**Features**:

- Real-time human-to-humanoid motion transfer
- Universal and dexterous whole-body teleoperation
- Learning-based approach (not just kinematic retargeting)
- Supports multiple input modalities

**G1 Relevance**: Demonstrated on humanoid platforms; adaptable to the G1 with appropriate configuration.

---

### 5. HoloMotion

**Repository:** https://github.com/HorizonRobotics/HoloMotion

A foundation model for whole-body humanoid control.

**Features**:

- Translates human motion data to robot-specific kinematic data
- Uses GMR internally for retargeting
- Produces optimized HDF5 datasets
- Foundation-model approach for generalizable control

**G1 Relevance**: Built on GMR, which directly supports the G1.

---

### 6. PhysDiff (Physics-Guided Human Motion Diffusion Model)

**Project Page:** https://nvlabs.github.io/PhysDiff/
**Paper:** ICCV 2023 (NVIDIA)

Not a retargeting tool per se, but a generator of physically plausible human motions.

**Features**:

- Physics-based motion projection module
- Uses motion imitation in a physics simulator
- Reduces floating, foot sliding, and ground penetration artifacts (>78% improvement)
- Can be used upstream to generate clean motions before retargeting

**G1 Relevance**: Generates higher-quality source motions that produce better retargeting results for the G1.

---

### 7. Unitree xr_teleoperate (Official)

**Repository:** https://github.com/unitreerobotics/xr_teleoperate

Unitree's official teleoperation stack with built-in hand retargeting.

**Features**:

- XR device teleoperation (Apple Vision Pro, PICO 4 Ultra Enterprise, Meta Quest 3)
- Dexterous hand retargeting algorithm library
- Support for Inspire and Unitree dexterous hands
- Direct G1 integration

**G1 Relevance**: First-party tool, purpose-built for G1 teleoperation.

---

### 8. Retargeted Dataset Collections on HuggingFace

Pre-computed retargeted datasets (no retargeting code needed):

| Dataset | Source | Robot | Link |
|---------|--------|-------|------|
| AMASS Retargeted for G1 | EMBER Lab, UC Berkeley | G1 (29 DoF) | [HuggingFace](https://huggingface.co/datasets/ember-lab-berkeley/AMASS_Retargeted_for_G1) |
| Retargeted AMASS for Robotics | fleaven | G1, others | [HuggingFace](https://huggingface.co/datasets/fleaven/Retargeted_AMASS_for_robotics) |
| G1 Retargeted Motions | openhe | G1 | [HuggingFace](https://huggingface.co/datasets/openhe/g1-retargeted-motions) |
| LAFAN1 Retargeting | lvhaidong | H1, H1_2, G1 | [HuggingFace](https://huggingface.co/datasets/lvhaidong/LAFAN1_Retargeting_Dataset) |

## G1 Relevance

For G1-specific retargeting, the recommended approach is:

1. **Quick start**: Use pre-retargeted datasets from HuggingFace (EMBER Lab or fleaven)
2. **Custom retargeting**: Use GMR with `--robot unitree_g1` for new motion data
3. **Real-time teleoperation**: Use xr_teleoperate (official) or GMR+TWIST for live retargeting
4. **Training pipeline**: Feed retargeted data into IsaacLab AMP for locomotion policy training

### Typical G1 Retargeting Workflow

```
Human MoCap Data (AMASS/BVH/FBX)
        |
        v
Retargeting Tool (GMR recommended)
  |-- Joint mapping to G1 29-DoF
  |-- Enforce joint limits from URDF
  |-- Fix ground contact / foot sliding
        |
        v
G1 Motion Data (numpy/HDF5)
        |
        v
Training Framework (IsaacLab AMP / LocoMuJoCo)
        |
        v
Locomotion Policy (deployable on real G1)
```

## Related Resources

- [awesome-humanoid-robot-learning](https://github.com/YanjieZe/awesome-humanoid-robot-learning) - Comprehensive paper list
- [awesome-unitree-robots](https://github.com/shaoxiang/awesome-unitree-robots) - Curated Unitree project collection
- [motion-retargeting GitHub topic](https://github.com/topics/motion-retargeting) - Community projects
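The typical workflow can be exercised end to end on synthetic data to check array shapes and file formats before plugging in a real retargeter. Everything in this sketch is illustrative and hypothetical: the random linear joint map stands in for a tool like GMR, the ±2 rad limits stand in for the G1 URDF limits, and the `.npz` output stands in for the HDF5/CSV exports the tools above produce. Only the 29-DoF count is taken from the G1 configuration:

```python
import numpy as np

T, HUMAN_JOINTS, G1_DOF = 120, 22, 29  # frames, mocap keypoints, G1 DoF

rng = np.random.default_rng(0)
# Stand-in for "Human MoCap Data": keypoint positions in meters
human = rng.normal(scale=0.5, size=(T, HUMAN_JOINTS, 3))

# 1. Joint mapping: a fixed linear map from flattened keypoints to DoF
#    (a real retargeter solves this per frame; this is a placeholder)
W = rng.normal(scale=0.05, size=(HUMAN_JOINTS * 3, G1_DOF))
q = human.reshape(T, -1) @ W            # (T, 29) joint-angle trajectory

# 2. Enforce joint limits (real limits come per joint from the URDF)
q = np.clip(q, -2.0, 2.0)

# 3. Export for a training framework (npz keeps the sketch dependency-free)
np.savez("g1_motion.npz", qpos=q, fps=np.array(50))
print(np.load("g1_motion.npz")["qpos"].shape)  # (120, 29)
```

The shape check at the end mirrors what IsaacLab- or LocoMuJoCo-style loaders expect: a `(frames, DoF)` array plus a frame rate.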