
Enhancing Humans Trust in Robots through Explanations

Digital Resources/Online E-Resources

  • Title:
    Enhancing Humans Trust in Robots through Explanations
  • Author: Javaid, Misbah ; Estivill-Castro, Vladimir ; Hexel, Rene
  • Subjects: Humans ; Interactions ; Robots ; Trust
  • Description: Robots have moved away from manufacturing environments and are now deployed as social robots in human environments such as hotels, shops, and hospitals, and as office coworkers. These robots complement human capabilities and skills with their own robotic skills. With the advancement of robots' technological capabilities, the roles of such sophisticated robots are evolving from obedient, deterministic machines to companions or teammates. Meanwhile, the role of humans is also changing from operators to team members. Therefore, robots are expected to collaborate and contribute productively with humans as teammates. We expect robots to develop social intelligence to behave smartly and to assist us in performing complex tasks. Still, robots lack the features that would permit them to be considered full-fledged teammates by their human counterparts. The inadequacy of humans' trust has been identified as a pre-eminent factor behind the unacceptability of robots as trustworthy teammates. Trust is an essential factor for achieving the full potential of human-robot teamwork. Trust directly affects a human's willingness to accept robot-produced information and suggestions, and hence the future use of robots also depends on trust. If humans do not trust robots, they may not utilize robotic features to their full potential. Research is ongoing into establishing and validating efficient and successful approaches for an extensive spectrum of Human-Robot Interaction issues. Pragmatic evaluations and investigations in the field of Human-Computer Interaction have already examined humans' trust in technical systems, mostly on issues such as reliability and accuracy of performance. We hypothesize that to integrate robots into human environments successfully, robots must make their decision-making transparent to the humans in the mixed human-robot team. We argue that the trust humans place in their robotic companions is influenced by the humans achieving some understanding of the robot's decision-making process. We propose to achieve higher levels of trust in robots by making the robots produce explanations in human-understandable terms. Our thesis is that explanations from robots shall express how a decision is made and why that decision is selected as the best among all alternatives. By augmenting robots with explanation capabilities, we help humans comprehend the behaviour of robots and support the establishment of successful and trustworthy human-robot interaction. Artificial intelligence researchers, within the area of expert systems, have also provided sufficient motivation to consider the contribution of explanations to building humans' trust and to the acceptability of these systems. Also, systems that provide explanations after a failure receive more tolerant responses from humans. Providing explanations for decisions is believed to be one of the most important capabilities of robots. However, to the best of our knowledge, there is still a gap in the current human-robot interaction literature: there is very little experimental verification showing that explanations facilitate and measurably affect humans' trust in and acceptance of robots. Previous research [1] used a different method to increase transparency by having a simulated robot provide explanations of its actions; explanations did not improve the team's performance, and trust was identified as an influential factor only under conditions of high reliability.
To better comprehend the emerging topic of trust, we adopted a human-in-the-loop approach, providing clear explanations with emphasis on the transparency and justification of the robot's decisions. We report on two user studies investigating the effect of a robot's explanations, delivered through different modalities (text and audio), on humans' level of trust during physical human-robot interactions. For user study 1, our setting consists of an interactive game-playing environment (the partial-information game of dominoes), in which the robot partners with a human to form a team. Since the game involves two adversarial teams, the robot plays two roles: partner to a human on one team, and adversary facing the opposing team of two humans. Explanations from the partner robot not only provide insight into the robot's decision-making process, but also help improve humans' learning of the task. We evaluated the participants' implicit trust in the robot through multi-modal scrutiny, i.e., recording participants' facial expressions and affective states during the game-play sessions. We also used questionnaires to measure participants' explicit trust and perception of the robot's attributes. Our results show that the participants considered the robot with explanation capabilities a trustworthy teammate. For user study 2, participants performed a decision-making task in collaboration with a real robot. Here we focused our inquiry on humans' conformity with and acceptance of the robot's answers as a new objective measure of the human-robot trust relationship. We found that participants trusted and conformed more with the robot's decisions (when communicated with explanations) than with their own decisions. Subjective questionnaire measures likewise reported an increase in participants' trust towards the robot. Through our experimental investigations, we conclude that explanations can be used as an effective communication modality for robots to earn human trust in social environments.
  • Degree: Thesis (PhD Doctorate), Doctor of Philosophy (PhD), School of Info & Comm Tech, Science, Environment, Engineering and Technology
  • Full Text Source: TROVE
  • Creation Date: 2021
  • Language: English
  • Source: Trove Australian Thesis (Full Text Open Access)
