Abstract: Building trust is often cited as important for the success of a service or application. When part of the system is an embodied conversational agent (ECA), the design of the ECA has an impact on a user's trust. In this paper we discuss whether designing an ECA for trust also means designing an ECA that gives a false impression of sentience, whether such implicit deception can undermine a sense of trust, and what impact such a design process may have on a vulnerable user group, in this case users living with dementia. We conclude by arguing that current trust metrics ignore the importance of a willing suspension of disbelief and its role in social computing.