Multi-robot Flocking
Demo remounted for the opening of the Stanford Robotics Center by: Michelle Pan, Catie Cuan
Original project by Catie Cuan, Kyle Jeffrey, Kim Kleiven, Adrian Li, Emre Fisher, Matt Harrison, Benji Holson, Allison Okamura, Matt Bennice
Music by: Peter Van Straten in collaboration with Tom Engbersen
Original project supported by Everyday Robots and Stanford University
SRC project supported by Hello Robot and the Stanford Robotics Lab
Dynamic Human-Robot Interaction through Adaptive Algorithms
As robots transition from commercial and research settings into everyday environments, social aims such as engagement and entertainment become increasingly relevant. This work presents a multi-robot task whose main aim is to enthrall and engage: a human is drawn to move alongside, and participate in, a dynamic, expressive robot flock. Toward this aim, the research team created algorithms for robot movements and engaging interaction modes such as gestures and sound. The contributions are as follows: (1) a novel group navigation algorithm involving human and robot agents, (2) a gesture-responsive algorithm for real-time, human-robot flocking interaction, (3) a weight mode characterization system for modifying flocking behavior, and (4) a method of encoding a choreographer's preferences within a dynamic, adaptive, learned system.
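To make the weight-mode idea concrete, the sketch below shows one way a set of named weight modes could parameterize Boids-style flocking around a human attractor. The mode names, weight values, and the `flock_step` function are illustrative assumptions for this sketch, not the project's actual algorithm or code.

```python
# A minimal, hypothetical sketch of weighted Boids-style flocking with a human
# "attractor" agent. Mode names and weights are illustrative assumptions.
from dataclasses import dataclass
import numpy as np

@dataclass
class WeightMode:
    """One named blend of flocking behaviors (a 'weight mode')."""
    separation: float        # avoid crowding nearby robots
    alignment: float         # match neighbors' heading
    cohesion: float          # move toward the local flock center
    human_attraction: float  # move toward the tracked human

# Example modes a choreographer might switch between at runtime (assumed).
MODES = {
    "calm":    WeightMode(1.5, 1.0, 1.0, 0.3),
    "curious": WeightMode(1.0, 0.5, 0.8, 1.5),
    "playful": WeightMode(2.0, 0.2, 0.4, 1.0),
}

def flock_step(positions, velocities, human_pos, mode, dt=0.05,
               neighbor_radius=1.5, max_speed=0.6):
    """One planar update for all robots under the given weight mode."""
    n = len(positions)
    new_vel = velocities.copy()
    for i in range(n):
        offsets = positions - positions[i]
        dists = np.linalg.norm(offsets, axis=1)
        neighbors = (dists > 0) & (dists < neighbor_radius)
        steer = np.zeros(2)
        if neighbors.any():
            # Separation: push away from close neighbors, scaled by proximity.
            steer -= mode.separation * (offsets[neighbors] /
                                        dists[neighbors, None] ** 2).sum(axis=0)
            # Alignment: nudge velocity toward the neighbors' mean velocity.
            steer += mode.alignment * (velocities[neighbors].mean(axis=0) - velocities[i])
            # Cohesion: nudge position toward the neighbors' centroid.
            steer += mode.cohesion * (positions[neighbors].mean(axis=0) - positions[i])
        # Human attraction: draw the robot toward the tracked person.
        steer += mode.human_attraction * (human_pos - positions[i])
        new_vel[i] = velocities[i] + steer * dt
        speed = np.linalg.norm(new_vel[i])
        if speed > max_speed:
            new_vel[i] *= max_speed / speed
    return positions + new_vel * dt, new_vel

# Usage: five robots and one stationary human, stepped in the "curious" mode.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos = rng.uniform(-2, 2, size=(5, 2))
    vel = np.zeros((5, 2))
    human = np.array([0.0, 0.0])
    for _ in range(100):
        pos, vel = flock_step(pos, vel, human, MODES["curious"])
```

In a structure like this, switching the active entry in `MODES` changes the character of the whole flock at once, which is one plausible way a choreographer's preferences could be exposed as a small set of tunable parameters.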