Gaze-based Control for Assistive Arms

[Image: a person piloting the gaze-based control demo]

Demo by: Shivani Guptasarma and Zhongchun Yu

Provided by ARMLab at Stanford (PI: Monroe Kennedy III) in collaboration with SRC.


Assistive robots such as the Jaco arm are typically operated manually with a joystick, switching sequentially through control modes, which is demanding for users who often have upper-limb movement impairments. We study ways to simplify the control of high-degree-of-freedom robotic arms using interfaces that gather additional information from both the user and the environment. In this demo, we use gaze tracking and cameras on the Microsoft HoloLens 2, together with surface electromyography (EMG) from an OyMotion armband, to move objects with a Jaco robot. Our previous work applying a similar approach to virtual prosthetic arms (https://arm.stanford.edu/proact) suggests that when only severely limited control inputs are available from a human user, gaze-based methods can make complex tasks easier to perform.
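
To make the idea concrete, here is a minimal sketch of this kind of shared-control loop. It is illustrative only, not the demo's actual implementation: the gaze target, EMG threshold, and send_goal interface are hypothetical placeholders. The user selects an object by looking at it, and a brief muscle contraction detected from EMG confirms the action.

    # Illustrative sketch: fusing a gaze-selected target with a binary EMG
    # trigger to issue a reach goal to an assistive arm. All names here are
    # hypothetical and stand in for the real perception and robot interfaces.
    from dataclasses import dataclass

    @dataclass
    class GazeTarget:
        """3D point (in the robot base frame) that the user is looking at."""
        x: float
        y: float
        z: float

    def emg_activated(rms_value: float, threshold: float = 0.3) -> bool:
        """Treat a muscle contraction above a fixed RMS threshold as a 'click'."""
        return rms_value > threshold

    def control_step(gaze: GazeTarget, emg_rms: float, send_goal) -> None:
        """One cycle of the shared-control loop: gaze selects, EMG confirms."""
        if emg_activated(emg_rms):
            # e.g. a reach-and-grasp goal for the arm at the gazed-at point
            send_goal((gaze.x, gaze.y, gaze.z))

    # Example usage with a stubbed-out goal sender:
    if __name__ == "__main__":
        control_step(GazeTarget(0.4, -0.1, 0.2), emg_rms=0.5, send_goal=print)

In a real system, the gaze target would come from the headset's eye tracker intersected with the scene perceived by its cameras, and the goal would go to the arm's motion planner rather than a simple callback.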