Assistive Robotics and Manipulation (ARM)
The Assistive Robotics and Manipulation Lab develops robots that improve everyday life by anticipating and acting on the needs of their human counterparts. We specialize in intelligent robotic systems that perceive and model environments, humans, and tasks, and that leverage these models to anticipate how a task will unfold and to determine the system's assistive role. Our research spans robotic assistants, connected devices, and intelligent wearables, drawing on tools from collaborative robotics, machine learning, computer vision, state estimation and prediction, dynamical systems analysis, and control theory.
Salisbury Robotics Lab
The Salisbury Robotics Lab currently conducts research on non-anthropomorphic in-hand manipulation; physical human-robot interaction (pHRI), at present in the context of a robotic Emergency Medical Technician (rEMT); patient-specific simulation of skull-base procedures such as cochlear implantation in a haptically enabled pre-operative planning environment; and the development of a low-impedance, high-dynamic-range manipulator concept. The lab is led by Prof. Ken Salisbury, with contributions from students including Shenli Yuan and Connor Yako.