Trust & Teaming

2014 - Present
Project Leader: Gale Lucas


Background
As humans increasingly team with automated agents, trust has become a critical factor. Trust in automation has never been a given, for a variety of reasons, including fear of losing jobs to automation. But trust issues around autonomy are complex. People's trust in automation also depends on their perception of it: is it capable of doing the required task? Will it understand their goals? People can overtrust as well, and then abandon the tool when it makes a mistake. How can trust be repaired within the human-automation relationship?

Even outside of work, people can be suspicious of new forms of automation. Traditional training resources have failed to focus on, or build, trust in the human-machine relationship. This is a significant problem: when users trust automation less than its capabilities warrant, they underutilize the system and fail to reap the benefits of partnering with automated agents.

Objectives
We use Virtual Learning Environments (VLEs), among other research paradigms, to study how and when relational factors can facilitate trust in automation, especially in automated teammates. Much of the prior research in this area has taken an "informational approach," in which trust is built by increasing users' understanding of the automation. Our work goes further, exploring how human-like, relational factors (e.g., rapport-building dialogue, affective listening) and natural interactions (e.g., natural language, contingent agent behaviors) can engender trust in human-machine relationships.

Our research was jointly funded by the U.S. Army Research Laboratory and DARPA, with additional support from NSF's Research on Emerging Technologies for Teaching and Learning program.

Results
In a sample of veterans and civilians, a virtual human interviewer doubled people's willingness to share sensitive or personal details about their lives compared to a human interviewer.

National Guard members trusted a virtual human interviewer, sharing 3x more symptoms of PTSD with the virtual human than with their commander.

Construction workers' trust in a demolition robot increased 3x more when they were trained with our VR-based training than with the traditional method of in-person training.

Next Steps
Published academic research papers are available here. For more information, Contact Us.

Download One-Sheet PDF