Social decisions and fairness change when people’s interests are represented by autonomous agents
by Celso M. de Melo, Stacy Marsella, Jonathan Gratch
Abstract:
There has been growing interest in agents that represent people’s interests or act on their behalf, such as automated negotiators, self-driving cars, or drones. Even though people will often interact with others via these agent representatives, little is known about whether people’s behavior changes when they act through these agents rather than interact directly with others. Here we show that people’s decisions change in important ways because of these agents; specifically, interacting via agents is likely to lead people to behave more fairly than interacting directly with others. We argue this occurs because programming an agent leads people to adopt a broader perspective, consider the other side’s position, and rely on social norms, such as fairness, to guide their decision making. To support this argument, we present four experiments: in Experiment 1, people made fairer offers in the ultimatum and impunity games when interacting via agent representatives than when interacting directly; in Experiment 2, participants were less likely to accept unfair offers in these games when agent representatives were involved; in Experiment 3, merely thinking about the decisions ahead of time, under the so-called “strategy method”, also increased fairness, even when no agents were involved; and, finally, in Experiment 4, participants were less likely to reach an agreement with unfair counterparts in a negotiation setting. We discuss theoretical implications for our understanding of people’s social behavior with agent representatives, as well as practical implications for the design of agents that have the potential to increase fairness in society.
Reference:
Social decisions and fairness change when people’s interests are represented by autonomous agents (Celso M. de Melo, Stacy Marsella, Jonathan Gratch), In Autonomous Agents and Multi-Agent Systems, pp. 163–187, 2017.
BibTeX Entry:
@article{de_melo_social_2017,
	title = {Social decisions and fairness change when people’s interests are represented by autonomous agents},
	issn = {1387-2532, 1573-7454},
	url = {http://link.springer.com/10.1007/s10458-017-9376-6},
	doi = {10.1007/s10458-017-9376-6},
	abstract = {There has been growing interest in agents that represent people’s interests or act on their behalf, such as automated negotiators, self-driving cars, or drones. Even though people will often interact with others via these agent representatives, little is known about whether people’s behavior changes when they act through these agents rather than interact directly with others. Here we show that people’s decisions change in important ways because of these agents; specifically, interacting via agents is likely to lead people to behave more fairly than interacting directly with others. We argue this occurs because programming an agent leads people to adopt a broader perspective, consider the other side’s position, and rely on social norms, such as fairness, to guide their decision making. To support this argument, we present four experiments: in Experiment 1, people made fairer offers in the ultimatum and impunity games when interacting via agent representatives than when interacting directly; in Experiment 2, participants were less likely to accept unfair offers in these games when agent representatives were involved; in Experiment 3, merely thinking about the decisions ahead of time, under the so-called “strategy method”, also increased fairness, even when no agents were involved; and, finally, in Experiment 4, participants were less likely to reach an agreement with unfair counterparts in a negotiation setting. We discuss theoretical implications for our understanding of people’s social behavior with agent representatives, as well as practical implications for the design of agents that have the potential to increase fairness in society.},
	journal = {Autonomous Agents and Multi-Agent Systems},
	author = {de Melo, Celso M. and Marsella, Stacy and Gratch, Jonathan},
	month = jul,
	year = {2017},
	keywords = {ARL, DoD, Virtual Humans},
	pages = {163--187}
}