XR Teleoperate
Source: https://github.com/unitreerobotics/xr_teleoperate Fetched: 2026-02-13 Type: GitHub Repository README
XR Teleoperation for Unitree Humanoid Robots
Project Overview
This repository implements teleoperation control of Unitree humanoid robots using XR (Extended Reality) devices such as Apple Vision Pro, PICO 4 Ultra Enterprise, or Meta Quest 3.
Supported Robots and End-Effectors
Robot Arms:
- G1 (29 DoF and 23 DoF variants)
- H1 (4-DoF arm)
- H1_2 (7-DoF arm)
End-Effectors:
- Dex1-1 gripper
- Dex3-1 dexterous hand
- Inspire dexterous hand (FTP and DFX variants)
- BrainCo dexterous hand
Installation Requirements
The project requires Ubuntu 20.04 or 22.04. Key dependencies include:
- Python 3.10 with Pinocchio 3.1.0 and NumPy 1.26.4 via conda
- unitree_sdk2_python library for robot communication
- Televuer module (with SSL certificate configuration for HTTPS/WebRTC)
- Teleimager submodule for image streaming
Installation involves:
- Creating a conda environment with required packages
- Cloning the repository and initializing submodules
- Installing televuer and teleimager packages
- Generating SSL certificates for secure XR device connections
- Installing the unitree_sdk2_python SDK
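Since the docs pin exact versions (Python 3.10, NumPy 1.26.4), a small preflight check can catch a mismatched conda environment before anything else fails. This is a sketch, not part of the repository; `check_environment` is a hypothetical helper built around the versions listed above.

```python
import sys

# Versions pinned by the xr_teleoperate installation instructions.
REQUIRED_PYTHON = (3, 10)
REQUIRED_NUMPY = "1.26.4"

def check_environment(python_version=None, numpy_version=None):
    """Return a list of problems; an empty list means the environment
    matches the documented requirements."""
    problems = []
    pv = tuple(python_version or sys.version_info[:2])
    if pv != REQUIRED_PYTHON:
        problems.append(f"Python {pv[0]}.{pv[1]} found; 3.10 required")
    if numpy_version is None:
        try:
            import numpy
            numpy_version = numpy.__version__
        except ImportError:
            problems.append("NumPy not installed; 1.26.4 required")
            return problems
    if numpy_version != REQUIRED_NUMPY:
        problems.append(f"NumPy {numpy_version} found; {REQUIRED_NUMPY} required")
    return problems
```

Running this inside the freshly created conda environment (before installing televuer, teleimager, and unitree_sdk2_python) gives an early, readable failure instead of an import error mid-session.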
Deployment Modes
The system supports multiple operational modes:
Control Modes:
- Hand tracking (gesture-based control)
- Controller tracking (device controller input)
Display Modes:
- Immersive (full VR environment)
- Ego (pass-through plus first-person window)
- Pass-through only
Operation Modes:
- Simulation (using Isaac Lab)
- Physical robot control
- Motion control (simultaneous arm and locomotion)
- Headless mode (for devices without displays)
- Recording mode for data collection
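The three mode axes above (control, display, operation) combine into one session configuration. A minimal sketch of that combination, using names taken from the lists above (the `SessionConfig` dataclass itself is an illustration, not the repository's API):

```python
from dataclasses import dataclass
from enum import Enum

class ControlMode(Enum):
    HAND = "hand"              # gesture-based control
    CONTROLLER = "controller"  # device controller input

class DisplayMode(Enum):
    IMMERSIVE = "immersive"        # full VR environment
    EGO = "ego"                    # pass-through + first-person window
    PASS_THROUGH = "pass-through"  # pass-through only

@dataclass
class SessionConfig:
    control: ControlMode
    display: DisplayMode
    sim: bool = False       # Isaac Lab simulation instead of hardware
    motion: bool = False    # simultaneous arm and locomotion control
    headless: bool = False  # device without a display
    record: bool = False    # data collection enabled
```

For example, `SessionConfig(ControlMode.HAND, DisplayMode.EGO, record=True)` describes a hand-tracked, ego-view data-collection session.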
Data Recording and Collection
The system enables teleoperation data recording when launched with the --record flag. Users can:
- Press 'r' to begin teleoperation
- Press 's' to start recording; press 's' again to stop and save an episode
- Repeat the process for multiple episodes
Recorded data is stored in xr_teleoperate/teleop/utils/data and can be processed for imitation learning training through the unitree_IL_lerobot repository.
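The key-driven flow above ('r' to begin teleoperation, 's' to toggle recording and save an episode) can be sketched as a small state machine. This is an illustrative model of the documented behavior, not the repository's actual recorder.

```python
class EpisodeRecorder:
    """Sketch of the recording flow: 'r' begins teleoperation;
    's' starts recording, and a second 's' stops and saves one episode."""

    def __init__(self):
        self.teleoperating = False
        self.recording = False
        self.episodes = []   # each saved episode is a list of frames
        self._frames = []

    def handle_key(self, key):
        if key == "r":
            self.teleoperating = True
        elif key == "s" and self.teleoperating:
            if self.recording:
                # Stop: flush the buffered frames as one episode.
                self.episodes.append(list(self._frames))
                self._frames.clear()
                self.recording = False
            else:
                self.recording = True

    def on_frame(self, frame):
        if self.recording:
            self._frames.append(frame)
```

Repeating the 's'/'s' cycle accumulates multiple episodes, mirroring the multi-episode collection workflow described above.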
Launch Parameters
Key command-line arguments control system behavior:
- --frequency: Control loop FPS (default: 30.0)
- --input-mode: hand or controller tracking
- --display-mode: immersive, ego, or pass-through
- --arm: Robot type (G1_29, G1_23, H1_2, H1)
- --ee: End-effector selection
- --img-server-ip: Image server IP address
- --network-interface: CycloneDDS interface configuration
- --motion: Enable concurrent motion control
- --sim: Enable simulation mode
- --record: Enable data recording mode
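The flags above map naturally onto an argparse parser. The sketch below mirrors the documented names and the 30.0 FPS default; all other defaults are assumptions for illustration, not values taken from the repository.

```python
import argparse

def build_parser():
    """Parser mirroring the documented launch flags. Only the
    --frequency default (30.0) comes from the docs; the rest are
    placeholder assumptions."""
    p = argparse.ArgumentParser(prog="teleop")
    p.add_argument("--frequency", type=float, default=30.0,
                   help="control loop FPS")
    p.add_argument("--input-mode", choices=["hand", "controller"],
                   default="hand")
    p.add_argument("--display-mode",
                   choices=["immersive", "ego", "pass-through"],
                   default="immersive")
    p.add_argument("--arm", choices=["G1_29", "G1_23", "H1_2", "H1"],
                   default="G1_29")
    p.add_argument("--ee", default=None, help="end-effector selection")
    p.add_argument("--img-server-ip", default=None)
    p.add_argument("--network-interface", default=None,
                   help="CycloneDDS interface")
    p.add_argument("--motion", action="store_true")
    p.add_argument("--sim", action="store_true")
    p.add_argument("--record", action="store_true")
    return p
```

For example, `build_parser().parse_args(["--arm", "H1_2", "--record"])` yields a namespace with `arm="H1_2"`, `record=True`, and the 30.0 FPS default.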
Physical Deployment Specifics
Physical deployment requires additional setup beyond simulation:
Image Service Setup: The development computing unit (PC2) on the robot must run the teleimager service to stream camera feeds. Users must:
- Clone and install teleimager on PC2
- Copy SSL certificates from the host machine
- Configure camera settings in cam_config_server.yaml
- Start the image service before teleoperation
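Because the image service on PC2 must be running before teleoperation starts, a simple reachability probe on the host side can avoid launching a session against a dead stream. This is a generic TCP check, not a repository utility; the port number is an assumption and should match whatever teleimager is configured to serve on.

```python
import socket

def image_service_ready(host, port, timeout=2.0):
    """Return True if a TCP service accepts connections at host:port.
    Host/port here are whatever teleimager on PC2 is configured with
    (assumption: a TCP-reachable endpoint)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Calling `image_service_ready(pc2_ip, port)` just before launching teleoperation gives a clear go/no-go rather than a stalled video feed.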
Hand Services (Optional): Depending on end-effector type, additional services may be required:
- Inspire hands require the DFX_inspire_service program
- BrainCo hands require their dedicated service
- Unitree Dex1-1 grippers have built-in support
XR Device Connectivity
XR devices connect via:
- WebRTC for real-time video streaming (requires certificate trust)
- WebSocket for control signal transmission
- HTTPS for secure web interface access
Users must navigate to the web interface URL (typically https://192.168.x.x:8012) on their XR device, trust the self-signed certificate, and enter VR mode to begin control.
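The connection steps above boil down to discovering which LAN address the headset should target and forming the HTTPS URL on port 8012 (the port shown in the README's example URL). A minimal sketch; `local_ip` is a common best-effort trick, not a repository function:

```python
import socket

def local_ip(probe_addr="8.8.8.8"):
    """Best-effort discovery of this host's LAN IP. connect() on a UDP
    socket sends no packets; it only selects the outbound interface."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect((probe_addr, 80))
        return s.getsockname()[0]
    finally:
        s.close()

def interface_url(host_ip, port=8012):
    """HTTPS URL the XR device must open; the self-signed certificate
    must be trusted on the device before WebRTC streaming works."""
    return f"https://{host_ip}:{port}"
```

For example, `interface_url("192.168.1.5")` returns `https://192.168.1.5:8012`, matching the README's `https://192.168.x.x:8012` pattern.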