Are human-like robots trusted like humans? An investigation into the effect of anthropomorphism on trust in robots measured by expected value as reflected by feedback related negativity and P300
dc.contributor.author | Wilson, L.O. | |
dc.date.accessioned | 2023-12-22T15:19:24Z | |
dc.date.available | 2023-12-22T15:19:24Z | |
dc.date.issued | 2023 | |
dc.identifier.citation |
Wilson, L.O. (2023) 'Are human-like robots trusted like humans? An investigation into the effect of anthropomorphism on trust in robots measured by expected value as reflected by feedback related negativity and P300', The Plymouth Student Scientist, 16(2), pp. 347-376. | en_US |
dc.identifier.uri | https://pearl.plymouth.ac.uk/handle/10026.1/21834 | |
dc.description.abstract |
Robots are becoming more prevalent in industry and society. However, in order to ensure their effective use, trust must be calibrated correctly. Anthropomorphism is one factor which is important in trust in robots (Hancock et al., 2011). Questionnaires and investment games have been used to investigate the impact of anthropomorphism on trust; however, these methods have led to disparate findings. Neurophysiological methods have also been used as an implicit measure of trust. Feedback-related negativity (FRN) and P300 are event-related potential (ERP) components which have been associated with processes involved in trust, such as outcome evaluation. This study uses the trust game (Berg et al., 1995), along with questionnaires and ERP data, to investigate trust and expectations towards three agents varying in anthropomorphism: a human, an anthropomorphic robot, and a computer. The behavioural and self-reported findings suggest that the human is perceived as the most trustworthy and that there is no difference between the robot and the computer. The ERP data revealed a robot-driven difference in FRN and P300 activation, which suggests that the robot violated expectations more so than the human or the computer. The present findings are explained in terms of the perfect automation schema and of trustworthiness and dominance perceptions. Future research into the impact of voice pitch on dominance and trustworthiness, and into the impact of trust violations, is suggested in order to gain a more holistic picture of the impact of anthropomorphism on trust. | en_US |
dc.language.iso | en | en_US |
dc.publisher | University of Plymouth | en_US |
dc.rights | Attribution 3.0 United States | * |
dc.rights.uri | http://creativecommons.org/licenses/by/3.0/us/ | * |
dc.subject | Trust | en_US |
dc.subject | Anthropomorphism | en_US |
dc.subject | Robots | en_US |
dc.subject | EEG | en_US |
dc.subject | FRN | en_US |
dc.subject | ERP | en_US |
dc.subject | Trustworthiness | en_US |
dc.subject | Likeability | en_US |
dc.subject | Dominance | en_US |
dc.subject | Intelligence | en_US |
dc.subject | Investment Game | en_US |
dc.subject | Trust Game | en_US |
dc.subject | Trust Violation | en_US |
dc.title | Are human-like robots trusted like humans? An investigation into the effect of anthropomorphism on trust in robots measured by expected value as reflected by feedback related negativity and P300 | en_US |
dc.type | Article | en_US |
plymouth.issue | 2 | |
plymouth.volume | 16 | |
plymouth.journal | The Plymouth Student Scientist |