This project focuses on building robust datasets that capture natural human behaviors elicited during human-robot interaction. For example, we are collecting data on the types of gestures users find intuitive when giving navigational commands to different kinds of proximal robots, such as ground vehicles, humanoid robots, and free-flying platforms.
IRON Lab CU Engineering Magazine