To Tell the Truth: Virtual Agents and Morning Morality
by Sharon Mozgai, Gale Lucas, Jonathan Gratch
Abstract:
This paper investigates the impact of time of day on truthfulness in human-agent interactions. Time of day has been found to have important implications for moral behavior in human-human interaction: the morning morality effect shows that people are more likely to act ethically (i.e., tell fewer lies) in the morning than in the afternoon. Based on previous work on disclosure and virtual agents, we propose that this effect will not bear out in human-agent interactions. Preliminary evaluation shows that individuals who lie during multi-issue bargaining tasks with the Conflict Resolution Agent, a semi-automated virtual human, tell more lies to human negotiation partners than to virtual agent negotiation partners in the afternoon, and are more likely to lie in the afternoon than in the morning when they believe they are negotiating with a human. Time of day does not have a significant effect on the number of lies told to the virtual agent during the multi-issue bargaining task.
Reference:
To Tell the Truth: Virtual Agents and Morning Morality (Sharon Mozgai, Gale Lucas, Jonathan Gratch), In Proceedings of the 17th International Conference on Intelligent Virtual Agents, Springer International Publishing, 2017.
Bibtex Entry:
@inproceedings{mozgai_tell_2017,
	address = {Stockholm, Sweden},
	title = {To {Tell} the {Truth}: {Virtual} {Agents} and {Morning} {Morality}},
	isbn = {978-3-319-67400-1, 978-3-319-67401-8},
	url = {http://link.springer.com/10.1007/978-3-319-67401-8_37},
	doi = {10.1007/978-3-319-67401-8_37},
	abstract = {This paper investigates the impact of time of day on truthfulness in human-agent interactions. Time of day has been found to have important implications for moral behavior in human-human interaction: the morning morality effect shows that people are more likely to act ethically (i.e., tell fewer lies) in the morning than in the afternoon. Based on previous work on disclosure and virtual agents, we propose that this effect will not bear out in human-agent interactions. Preliminary evaluation shows that individuals who lie during multi-issue bargaining tasks with the Conflict Resolution Agent, a semi-automated virtual human, tell more lies to human negotiation partners than to virtual agent negotiation partners in the afternoon, and are more likely to lie in the afternoon than in the morning when they believe they are negotiating with a human. Time of day does not have a significant effect on the number of lies told to the virtual agent during the multi-issue bargaining task.},
	booktitle = {Proceedings of the 17th {International} {Conference} on {Intelligent} {Virtual} {Agents}},
	publisher = {Springer International Publishing},
	author = {Mozgai, Sharon and Lucas, Gale and Gratch, Jonathan},
	month = aug,
	year = {2017},
	keywords = {UARC, Virtual Humans},
	pages = {283--286}
}