---
id: sensors-perception
title: "Sensors & Perception"
status: established
source_sections: "reference/sources/official-product-page.md, reference/sources/official-developer-guide.md"
related_topics: [hardware-specs, locomotion-control, sdk-programming, safety-limits]
key_equations: []
key_terms: [imu, state_estimation, realsense_d435i, livox_mid360, tactile_sensor, dual_encoder]
images: []
examples: []
open_questions:
  - "IMU model and noise characteristics"
  - "D435i resolution/FPS as configured on G1 specifically"
  - "Force/torque sensor availability per variant"
---

# Sensors & Perception

Sensor suite, specifications, and perception capabilities of the G1.

## 1. Sensor Suite Overview

| Sensor Type         | Model/Spec               | Location      | Data Type           | Network/Interface         | Tier |
|---------------------|--------------------------|---------------|---------------------|---------------------------|------|
| Depth Camera        | Intel RealSense D435i    | Head          | RGB-D               | USB, SDK access           | T0   |
| 3D LiDAR            | Livox MID360             | Head          | 360° point cloud    | Ethernet (192.168.123.20) | T0   |
| IMU                 | Onboard IMU              | Body (torso)  | 6-axis accel + gyro | Internal, via LowState    | T0   |
| Microphone Array    | 4-element array          | Head          | Audio (spatial)     | Internal                  | T0   |
| Speaker             | 5W stereo                | Head          | Audio output        | Internal                  | T0   |
| Joint Encoders      | Dual encoders per joint  | Each joint    | Position + velocity | Internal, via LowState    | T0   |
| Temperature Sensors | Per-motor                | Each actuator | Temperature         | Internal, via LowState    | T0   |
| Tactile Sensors     | 33 per hand (Dex3-1)     | Fingertips    | Force/touch         | Via Dex3-1 DDS topic      | T0   |

[T0 — Official product page and developer guide]

## 2. Depth Camera — Intel RealSense D435i

The primary visual sensor for RGB-D perception, obstacle detection, and environment mapping.
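To put the FOV figures in the specs below into perspective: the linear extent covered by the depth image at a given range follows directly from the field of view, as width = 2 · d · tan(FOV/2). A minimal sketch, assuming the standard D435i depth FOV of 87° × 58° (not a G1-confirmed configuration; see open questions):

```python
import math

def fov_coverage(distance_m: float, fov_deg: float) -> float:
    """Linear extent (m) covered by a field of view at a given distance."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)

# Standard D435i depth FOV (assumed, not G1-confirmed): 87 deg H x 58 deg V.
h = fov_coverage(2.0, 87.0)  # horizontal coverage at 2 m
v = fov_coverage(2.0, 58.0)  # vertical coverage at 2 m
print(f"at 2 m: {h:.2f} m wide x {v:.2f} m tall")  # at 2 m: 3.80 m wide x 2.22 m tall
```

At a typical 2 m obstacle-avoidance range the depth frustum is thus nearly 4 m wide, which is why a single head-mounted camera suffices for forward obstacle detection while the LiDAR covers the remaining 360°.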
- **Standard D435i specs:** 1280×720 @ 30 fps (depth), 1920×1080 @ 30 fps (RGB), 87° × 58° FOV (depth)
- **Note:** the G1-specific configuration (resolution, FPS) may differ from standard D435i defaults — see open questions
- **Mounting:** head-mounted, forward-facing
- **Access:** via the RealSense SDK on the development computer, or via the SDK2 camera API
- **Research:** used for end-to-end vision-based locomotion (arXiv:2602.06382) with depth augmentation [T1]

## 3. 3D LiDAR — Livox MID360

360-degree environment scanning for navigation, SLAM, and obstacle avoidance.

- **Horizontal FOV:** 360°
- **Vertical FOV:** 59°
- **Network:** Ethernet at 192.168.123.20
- **Access:** requires a separate Livox driver setup (unilidar_sdk2)
- **Use cases:** SLAM, navigation, environment mapping, point cloud generation

## 4. IMU (Inertial Measurement Unit)

Critical for balance and state estimation during locomotion.

- **Type:** 6-axis (3-axis accelerometer + 3-axis gyroscope)
- **Access:** available in `rt/lowstate` → `imu_state` field
- **Used by:** the locomotion computer, for real-time balance control and state estimation
- **User access:** via SDK2 LowState subscription

## 5. Joint Proprioception

Dual encoders per joint provide high-accuracy feedback:

- **Position feedback:** absolute joint angle (rad)
- **Velocity feedback:** joint angular velocity (rad/s)
- **Access:** `rt/lowstate` → `motor_state` array
- **Rate:** updated at 500 Hz (the control-loop frequency)

## 6. Perception Pipeline

```
Sensors → Locomotion Computer → DDS → Development Computer → User Applications
   │            │
   │            └── IMU + encoders → state estimation → balance control (internal)
   │
   ├── D435i → RealSense SDK → RGB-D frames → perception/planning
   ├── MID360 → Livox driver → point cloud → SLAM/navigation
   └── Microphones → audio processing → voice interaction
```

The locomotion computer handles proprioceptive sensing (IMU, encoders) internally for real-time control. Exteroceptive sensors (cameras, LiDAR) are processed on the development computer.
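On the proprioceptive side of this pipeline, IMU orientation is conventionally carried as a unit quaternion and converted to roll/pitch/yaw for control. The sketch below is the standard ZYX Euler conversion as illustrative math only; it is not the SDK2 API, and no field layout of `imu_state` is assumed:

```python
import math

def quat_to_rpy(w: float, x: float, y: float, z: float):
    """Convert a unit quaternion (w, x, y, z) to roll/pitch/yaw in radians,
    using the common ZYX (yaw-pitch-roll) convention."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Clamp the asin argument to guard against numerical drift outside [-1, 1].
    sinp = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(sinp)
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# Identity quaternion: no rotation.
print(quat_to_rpy(1.0, 0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0)
# 90-degree yaw: w = cos(45 deg), z = sin(45 deg).
_, _, yaw = quat_to_rpy(math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
print(round(yaw, 4))  # 1.5708
```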
[T1 — Inferred from architecture]

## 7. State Estimation

The locomotion computer fuses IMU and joint encoder data for real-time state estimation: [T2 — Observed from RL papers]

- Body orientation (roll, pitch, yaw) from the IMU
- Angular velocity from the IMU gyroscope
- Joint positions and velocities from the encoders
- Contact-state inference for foot contact detection

This state estimate feeds directly into the RL-based locomotion policy at 500 Hz.

## Key Relationships

- Mounted on: [[hardware-specs]]
- Feeds into: [[locomotion-control]] (balance, terrain adaptation)
- Accessed via: [[sdk-programming]] (sensor data API)
- Network config: [[networking-comms]]
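To make the IMU fusion described in Section 7 concrete: a complementary filter is one common way to blend gyroscope integration (accurate short-term, drifts long-term) with accelerometer tilt (noisy short-term, drift-free long-term). This is an illustrative reconstruction, not the G1's actual estimator, and the gain `alpha`, loop rate, and function name are arbitrary choices:

```python
import math

def complementary_filter(pitch: float, gyro_rate: float,
                         accel_x: float, accel_z: float,
                         dt: float, alpha: float = 0.98) -> float:
    """One update step of a toy pitch estimator.

    gyro_rate: pitch angular velocity (rad/s) from the gyroscope.
    accel_x, accel_z: body-frame accelerometer readings (m/s^2).
    alpha: trust in the integrated gyro vs. the accelerometer tilt.
    """
    gyro_pitch = pitch + gyro_rate * dt          # short-term: integrate gyro
    accel_pitch = math.atan2(-accel_x, accel_z)  # long-term: gravity direction
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# Robot held static at ~5 degrees of pitch: the estimate converges toward the
# accelerometer tilt even when initialized at zero.
true_pitch = math.radians(5.0)
ax, az = -9.81 * math.sin(true_pitch), 9.81 * math.cos(true_pitch)
est = 0.0
for _ in range(500):  # 1 s of updates at a 500 Hz loop rate
    est = complementary_filter(est, 0.0, ax, az, dt=1 / 500)
print(round(math.degrees(est), 2))  # 5.0
```

The 500 Hz loop rate mirrors the control frequency cited above; in practice the estimator would also fold in the encoder-based kinematics and contact inference listed in Section 7.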