Trust in automated systems: the effect of automation level on trust calibration
Main Author:
Other Authors:
Published: Monterey, California: Naval Postgraduate School, 2012
Online Access: http://hdl.handle.net/10945/5628
Summary: Approved for public release; distribution is unlimited. Automated systems perform functions that were previously executed by a human. When automation is used, the role of the human changes from operator to supervisor. For effective operation, the human must appropriately calibrate trust in the automated system; improper trust leads to misuse or disuse of the system. The responsibilities of an automated system can be described by its level of automation. This study examined the effect of varying levels of automation and accuracy on trust calibration. Thirty participants were divided into three groups based on the system's level of automation and provided with an automated identification system. Within the Virtual Battlespace 2 environment, participants controlled the video feed of an unmanned aircraft while identifying friendly and enemy personnel on the ground. Results indicate significant differences in target-identification performance across levels of automation and accuracy. Participants exhibited better trust calibration at the management-by-consent level of automation and at the lower accuracy level. These findings demonstrate the need for continued research on trust in automation.