Open-TeleVision: Teleoperation with Immersive Active Visual Feedback
Date: July 2024
Teams: MIT and the University of California, San Diego
Authors: Xuxin Cheng, Jialong Li, Shiqi Yang, Ge Yang, Xiaolong Wang
PDF: Open-TeleVision: Teleoperation with Immersive Active Visual Feedback
Abstract
Teleoperation serves as a powerful method for collecting the on-robot data essential for robot learning from demonstrations. The intuitiveness and ease of use of a teleoperation system are crucial for ensuring high-quality, diverse, and scalable data. To achieve this, we propose an immersive teleoperation system, Open-TeleVision, that allows operators to actively perceive the robot's surroundings stereoscopically. The system also mirrors the operator's arm and hand movements on the robot, creating an immersive experience as if the operator's mind were transmitted to a robot embodiment. We validate the effectiveness of our system by collecting data and training imitation learning policies on four long-horizon, precision tasks (Can Sorting, Can Insertion, Folding, and Unloading) for two different humanoid robots, and deploy the policies in the real world. The system is open-sourced at: this https URL