Show simple item record

dc.contributor.author: Wilson, L.O.
dc.date.accessioned: 2023-12-22T15:19:24Z
dc.date.available: 2023-12-22T15:19:24Z
dc.date.issued: 2023
dc.identifier.citation: Wilson, L.O. (2023) 'Are human-like robots trusted like humans? An investigation into the effect of anthropomorphism on trust in robots measured by expected value as reflected by feedback related negativity and P300', The Plymouth Student Scientist, 16(2), pp. 347-376. (en_US)
dc.identifier.uri: https://pearl.plymouth.ac.uk/handle/10026.1/21834
dc.description.abstract: Robots are becoming more prevalent in industry and society. However, in order to ensure their effective use, trust in robots must be calibrated correctly. Anthropomorphism is one factor which is important for trust in robots (Hancock et al., 2011). Questionnaires and investment games have been used to investigate the impact of anthropomorphism on trust; however, these methods have led to disparate findings. Neurophysiological methods have also been used as an implicit measure of trust. Feedback-related negativity (FRN) and P300 are event-related potential (ERP) components which have been associated with processes involved in trust, such as outcome evaluation. This study uses the trust game (Berg et al., 1995), along with questionnaires and ERP data, to investigate trust and expectations towards three agents varying in anthropomorphism: a human, an anthropomorphic robot, and a computer. The behavioural and self-reported findings suggest that the human is perceived as the most trustworthy, with no difference between the robot and the computer. The ERP data revealed a robot-driven difference in FRN and P300 activation, which suggests that the robot violated expectations more than the human or the computer did. The present findings are explained in terms of the perfect automation schema and perceptions of trustworthiness and dominance. Future research into the impact of voice pitch on dominance and trustworthiness, and into the impact of trust violations, is suggested in order to gain a more holistic picture of the impact of anthropomorphism on trust. (en_US)
dc.language.iso: en (en_US)
dc.publisher: University of Plymouth (en_US)
dc.rights: Attribution 3.0 United States
dc.rights.uri: http://creativecommons.org/licenses/by/3.0/us/
dc.subject: Trust (en_US)
dc.subject: Anthropomorphism (en_US)
dc.subject: Robots (en_US)
dc.subject: EEG (en_US)
dc.subject: FRN (en_US)
dc.subject: ERP (en_US)
dc.subject: Trustworthiness (en_US)
dc.subject: Likeability (en_US)
dc.subject: Dominance (en_US)
dc.subject: Intelligence (en_US)
dc.subject: Investment Game (en_US)
dc.subject: Trust Game (en_US)
dc.subject: Trust Violation (en_US)
dc.title: Are human-like robots trusted like humans? An investigation into the effect of anthropomorphism on trust in robots measured by expected value as reflected by feedback related negativity and P300 (en_US)
dc.type: Article (en_US)
plymouth.issue: 2
plymouth.volume: 16
plymouth.journal: The Plymouth Student Scientist


Files in this item



Attribution 3.0 United States
Except where otherwise noted, this item's license is described as Attribution 3.0 United States

All items in PEARL are protected by copyright law.
Author manuscripts deposited to comply with open access mandates are made available in accordance with publisher policies. Please cite only the published version using the details provided on the item record or document. In the absence of an open licence (e.g. Creative Commons), permissions for further reuse of content should be sought from the publisher or author.