UMI on Legs
Demo by: Huy Quoc Ha
Provided by the Robotics and Embodied AI Lab (PI: Shuran Song) in collaboration with SRC
UMI on Legs is a framework for cross-embodiment deployment of robot manipulation policies, built from two parts. The first is the Universal Manipulation Interface (UMI), a handheld gripper with a camera that lets anyone intuitively collect robot demonstrations anywhere. This data is used to train a manipulation policy that predicts gripper movements from image inputs. The second is a whole-body controller, which outputs the robot's joint commands from target gripper movements. The teacup demo seen here was created from data collected by humans (with no robots, just the handheld gripper) and deployed directly on a quadruped. By decoupling data collection from the robot, collection becomes not only cheaper but also future-proofed for new robot hardware platforms.
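The two-stage pipeline above can be sketched in code. This is a minimal illustrative skeleton, not the actual UMI on Legs implementation: the class names, pose representation, and joint count are all assumptions made up for this example, and the learned components are replaced with placeholders.

```python
import numpy as np


class ManipulationPolicy:
    """Hypothetical stand-in for a UMI-trained visuomotor policy:
    maps a camera image to a target gripper pose."""

    def predict(self, image: np.ndarray) -> np.ndarray:
        # Placeholder: a real policy would be a learned network.
        # Assumed output: [x, y, z, roll, pitch, yaw, gripper_width].
        return np.zeros(7)


class WholeBodyController:
    """Hypothetical whole-body controller: maps a target gripper pose
    to joint commands for the quadruped's legs and arm."""

    def __init__(self, num_joints: int = 18):  # joint count is assumed
        self.num_joints = num_joints

    def compute(self, target_pose: np.ndarray) -> np.ndarray:
        # Placeholder: a real controller would solve for joint
        # positions/torques that realize the target gripper motion.
        return np.zeros(self.num_joints)


def control_step(image, policy, wbc):
    """One step of the two-stage pipeline described in the text."""
    target_pose = policy.predict(image)    # stage 1: UMI policy
    joint_cmds = wbc.compute(target_pose)  # stage 2: whole-body controller
    return joint_cmds


image = np.zeros((224, 224, 3))  # dummy camera frame
cmds = control_step(image, ManipulationPolicy(), WholeBodyController())
print(cmds.shape)
```

The key design point this sketch mirrors is the interface between the stages: the policy only ever emits gripper-space targets, so the same demonstration data can drive any robot for which a whole-body controller exists.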