Sponsor: NSF CHS 1764092

PI: Dan Szafir, co-PIs: Christoffer Heckman, Danielle Albers Szafir

Abstract: Robots can augment emergency response teams by collecting information in environments that are dangerous or inaccessible for human responders, such as wildfire fighting, search and rescue, or hurricane response. For example, robots might collect critical visual, mapping, and environmental data that inform responders of conditions ahead and improve their awareness of the operational environment. However, response teams currently have little ability to access robot-collected information directly in the field, despite its value for responding rapidly to local conditions, because current systems typically route the data through a central command post. Through collaboration with several local response groups, the project team will develop a better understanding of responders' needs and concerns around robot-collected data, algorithms and visualizations that meet those needs using augmented reality technologies, and systems that integrate well with responders' actual work practices. The team will also develop novel algorithms for 3D scene reconstruction and simultaneous localization and mapping (SLAM) that will be useful for a broad variety of applications. Overall, the project will contribute empirical knowledge of how different factors of augmented reality head-mounted display (ARHMD) visualizations influence data interpretation, novel algorithms for estimating, correcting, and sharing maps between intermittently-networked agents in the field, and information regarding how data from collocated robots can mediate human-robot interactions, particularly within the context of emergency response.

Publications

Chen, Z., Heckman, C., Julier, S., & Ahmed, N. "Weak in the NEES?: Auto-Tuning Kalman Filters with Bayesian Optimization." 21st International Conference on Information Fusion (FUSION), 2018. doi:10.23919/ICIF.2018.8454982

Walker, M. E., Szafir, D., & Rae, I. "The Influence of Size in Augmented Reality Telepresence Avatars." IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR), 2019. doi:10.1109/VR.2019.8798152

Szafir, D. "Mediating Human-Robot Interactions with Virtual, Augmented, and Mixed Reality." International Conference on Human-Computer Interaction, v. 11575, 2019.

Elliott, M., Xiong, C., Nothelfer, C., & Albers Szafir, D. "A Design Space of Vision Science Methods for Visualization Research." IEEE Transactions on Visualization and Computer Graphics (TVCG), 2021 (to appear).

Whitlock, M., Wu, K., & Albers Szafir, D. "Designing for Mobile and Immersive Visual Analytics in the Field." IEEE Transactions on Visualization and Computer Graphics (TVCG), 2019.

Whitlock, M., Albers Szafir, D., & Gruchalla, K. "HydrogenAR: Interactive Data-Driven Storytelling for Dispenser Reliability." Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR), 2020 (to appear).

Whitlock, M., Smart, S., & Albers Szafir, D. "Graphical Perception for Immersive Analytics." IEEE Virtual Reality (VR), 2020.

Whitlock, M., Mitchell, J., Pfeufer, N., Arnot, B., Craig, R., Wilson, B., Chung, B., & Albers Szafir, D. "MRCAT: In Situ Prototyping of Interactive AR Environments." International Conference on Virtual, Augmented and Mixed Reality (VAMR), 2020.

Whitlock, M., Leithinger, D., Szafir, D., & Albers Szafir, D. "Toward Effective Multimodal Interaction in Augmented Reality." 4th Workshop on Immersive Analytics: Envisioning Future Productivity for Immersive Analytics at ACM CHI 2020, 2020.

Whitlock, M., & Albers Szafir, D. "Situated Prototyping of Data-Driven Applications in Augmented Reality." Workshop on Interaction Design and Prototyping for Immersive Analytics at ACM CHI 2019, 2019.