Situation Awareness and Trust Methods for Improving Human-Robot Team Performance
Ali, Arsha
2025
Abstract
The field of human-robot interaction is dedicated to understanding how humans interact with robots and to developing interventions that facilitate desirable interactions. Within this expansive field, this dissertation explores two fundamental concepts: trust and situation awareness. Trust guides the relationships between humans and robots and will ultimately be needed for robot acceptance and appropriate use by humans. Situation awareness in humans and robots is needed for decision making and action execution to effectively achieve goals. With advancements in robot sensing and reasoning, it is envisioned that robots will take trust and situation awareness into consideration to shape their behaviors. This dissertation explores how trust and situation awareness apply to, and can facilitate, interactions between humans and robots. Three high-level problems motivate this work: (1) how tasks can be allocated between humans and robots, (2) how appropriate trust can be fostered, and (3) how situation awareness can be improved. Methods are presented for allocating tasks between humans and robots using trust, understanding team trust dynamics, and perceiving and enhancing situation awareness. Addressing these problems improves human-robot interactions and, consequently, human-robot team performance. To address the problem of allocating tasks between humans and robots, we present a human-robot task allocation method that incorporates trust. The method allocates both familiar tasks and novel tasks the team has not seen before, and learns an agent's initially unknown capabilities. In simulation, our task allocation method outperforms other methods, and it is offered as a contribution and solution to the task allocation problem.
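The abstract does not specify the allocation algorithm, so the following is only a minimal illustrative sketch of one common way to allocate tasks using trust while learning initially unknown capabilities: a Beta-Bernoulli trust model, in which each agent-task pairing starts from an uninformative prior and each observed outcome sharpens the estimate. All class names, task names, and the update scheme below are invented assumptions, not the dissertation's actual method.

```python
class TrustBasedAllocator:
    """Illustrative sketch: allocate each task to the agent (human or
    robot) with the highest estimated success probability, modeled as a
    Beta-Bernoulli trust estimate updated from observed outcomes."""

    def __init__(self, agents, task_types):
        # Beta(1, 1) prior for every agent-task pair: capabilities
        # start out unknown, matching the "initially unknown" setting.
        self.params = {(a, t): [1.0, 1.0] for a in agents for t in task_types}

    def trust(self, agent, task_type):
        # Posterior mean success probability under the Beta model.
        alpha, beta = self.params[(agent, task_type)]
        return alpha / (alpha + beta)

    def allocate(self, task_type, agents):
        # Assign the task to the currently most-trusted agent.
        return max(agents, key=lambda a: self.trust(a, task_type))

    def update(self, agent, task_type, success):
        # Each observed outcome sharpens the trust estimate.
        alpha, beta = self.params[(agent, task_type)]
        if success:
            alpha += 1.0
        else:
            beta += 1.0
        self.params[(agent, task_type)] = [alpha, beta]


# Toy usage: after a few observed outcomes, the robot is preferred
# for the (hypothetical) "inspect" task.
agents = ["human", "robot"]
allocator = TrustBasedAllocator(agents, ["inspect"])
for _ in range(3):
    allocator.update("robot", "inspect", True)
allocator.update("human", "inspect", False)
chosen = allocator.allocate("inspect", agents)
```

A Beta-Bernoulli model is one natural fit here because its posterior mean moves smoothly from an agnostic 0.5 toward the observed success rate as evidence accumulates, which is one way to balance exploring unknown capabilities against exploiting known ones.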
Although the problem of fostering appropriate trust remains open, we take a step in that direction with an analysis of longitudinal experiment data from a large team working with an unreliable autonomous system. Despite the system's persistent unreliability, the team's trust in and engagement with the system do not decrease. This contribution demonstrates how autonomy can maintain trust and engagement even when performing unreliably, drawing attention to the need for strategies that address inappropriate trust and reliance in large teams. For the problem of improving situation awareness, a multi-phase solution is presented. An experiment on how shared mental models and the amount of communication impact team situation awareness contributes an expression of team situation awareness and a demonstration that shared mental models are critical for team situation awareness when communication is limited. An experiment on communication sources clarifies the crucial role of external communication in enhancing human situation awareness. Then, a situation awareness system that estimates human situation awareness in real time and adapts robot behavior in response is developed and evaluated. Correlations in non-intrusive eye-tracking and behavioral measures are leveraged to implement a binary situation awareness estimator through a logistic regression approach. Robot behavior is adapted to communicate warnings when the human's situation awareness is estimated to be insufficient for the present context. Through experimentation, the situation awareness system is demonstrated to improve situation awareness and performance. This dissertation enriches our knowledge of how humans interact with robots and provides implications for robot design and behavior to improve human-robot team performance. With effective human-robot interaction, robots can integrate as collaborative teammates with humans.
These works move us closer to a reality where humans and robots work harmoniously to achieve goals and enhance society.
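The abstract describes a binary situation awareness estimator built with logistic regression over non-intrusive eye-tracking and behavioral measures. The actual features, data, and training procedure are not given, so everything below is an invented illustration: a minimal pure-Python logistic regression fit by batch gradient descent, with made-up features standing in for measures such as fixations on a hazard and gaze dispersion.

```python
import math


def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))


def train_logistic(X, y, lr=0.5, epochs=2000):
    """Fit logistic regression weights with batch gradient descent.
    A sketch only; a real system would likely use an ML library."""
    w = [0.0] * len(X[0])
    b = 0.0
    n = len(X)
    for _ in range(epochs):
        grad_w = [0.0] * len(w)
        grad_b = 0.0
        for xi, yi in zip(X, y):
            # Prediction error drives the gradient for each feature.
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            for j, xj in enumerate(xi):
                grad_w[j] += err * xj
            grad_b += err
        w = [wj - lr * gj / n for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / n
    return w, b


def sa_sufficient(features, w, b, threshold=0.5):
    """Binary SA estimate: True if estimated situation awareness is
    sufficient, False if the robot should communicate a warning."""
    z = sum(wj * xj for wj, xj in zip(w, features)) + b
    return sigmoid(z) >= threshold


# Invented toy data: [fixations on hazard per second, gaze dispersion],
# label 1 = sufficient SA, 0 = insufficient SA.
X = [[0.9, 0.2], [0.8, 0.3], [0.7, 0.25],
     [0.1, 0.8], [0.2, 0.9], [0.15, 0.7]]
y = [1, 1, 1, 0, 0, 0]
w, b = train_logistic(X, y)
```

In a deployed system of the kind the abstract describes, `sa_sufficient` returning False for the current measurements would be the trigger for the robot to adapt its behavior and communicate a warning.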
Subjects
human-robot teaming, situation awareness, trust
Types
Thesis