Welcome to the Real World: How Agent Strategy Increases Human Willingness to Deceive (bibtex)
by Johnathan Mell, Gale M. Lucas, Jonathan Gratch
Abstract:
Humans that negotiate through representatives often instruct those representatives to act in certain ways that align with both the client's goals and his or her social norms. However, which tactics and ethical norms humans endorse vary widely from person to person, and these endorsements may be easy to manipulate. This work presents the results of a study that demonstrates that humans that interact with an artificial agent may change what kinds of tactics and norms they endorse, often dramatically. Previous work has indicated that people that negotiate through artificial agent representatives may be more inclined to fairness than those people that negotiate directly. Our work qualifies that initial picture, demonstrating that subsequent experience may change this tendency toward fairness. By exposing human negotiators to tough, automated agents, we are able to shift the participant's willingness to deceive others and utilize "hard-ball" negotiation techniques. In short, what techniques people decide to endorse is dependent upon their context and experience.

We examine the effects of interacting with four different types of automated agents, each with a unique strategy, and how this subsequently changes which strategies a human negotiator might later endorse. In the study, which was conducted on an online negotiation platform, four different types of automated agents negotiate with humans over the course of a 10-minute interaction. The agents differ in a 2x2 design according to agent strategy (tough vs. fair) and agent attitude (nice vs. nasty). These results show that in this multi-issue bargaining task, humans that interacted with a tough agent were more willing to endorse deceptive techniques when instructing their own representative. These kinds of techniques were endorsed even if the agent the human encountered did not use deception as part of its strategy. In contrast to some previous work, there was not a significant effect of agent attitude. These results indicate the power of allowing people to program agents that follow their instructions, but also indicate that these social norms and tactic endorsements may be mutable in the presence of real negotiation experience.
Reference:
Welcome to the Real World: How Agent Strategy Increases Human Willingness to Deceive (Johnathan Mell, Gale M. Lucas, Jonathan Gratch), In Proceedings of the 17th International Conference on Autonomous Agents and MultiAgent Systems, International Foundation for Autonomous Agents and Multiagent Systems, 2018.
Bibtex Entry:
@inproceedings{mell_welcome_2018,
	address = {Stockholm, Sweden},
	title = {Welcome to the {Real} {World}: {How} {Agent} {Strategy} {Increases} {Human} {Willingness} to {Deceive}},
	url = {https://dl.acm.org/citation.cfm?id=3237884},
	abstract = {Humans that negotiate through representatives often instruct those representatives to act in certain ways that align with both the client's goals and his or her social norms. However, which tactics and ethical norms humans endorse vary widely from person to person, and these endorsements may be easy to manipulate. This work presents the results of a study that demonstrates that humans that interact with an artificial agent may change what kinds of tactics and norms they endorse, often dramatically. Previous work has indicated that people that negotiate through artificial agent representatives may be more inclined to fairness than those people that negotiate directly. Our work qualifies that initial picture, demonstrating that subsequent experience may change this tendency toward fairness. By exposing human negotiators to tough, automated agents, we are able to shift the participant's willingness to deceive others and utilize "hard-ball" negotiation techniques. In short, what techniques people decide to endorse is dependent upon their context and experience.

We examine the effects of interacting with four different types of automated agents, each with a unique strategy, and how this subsequently changes which strategies a human negotiator might later endorse. In the study, which was conducted on an online negotiation platform, four different types of automated agents negotiate with humans over the course of a 10-minute interaction. The agents differ in a 2x2 design according to agent strategy (tough vs. fair) and agent attitude (nice vs. nasty). These results show that in this multi-issue bargaining task, humans that interacted with a tough agent were more willing to endorse deceptive techniques when instructing their own representative. These kinds of techniques were endorsed even if the agent the human encountered did not use deception as part of its strategy. In contrast to some previous work, there was not a significant effect of agent attitude. These results indicate the power of allowing people to program agents that follow their instructions, but also indicate that these social norms and tactic endorsements may be mutable in the presence of real negotiation experience.},
	booktitle = {Proceedings of the 17th {International} {Conference} on {Autonomous} {Agents} and {MultiAgent} {Systems}},
	publisher = {International Foundation for Autonomous Agents and Multiagent Systems},
	author = {Mell, Johnathan and Lucas, Gale M. and Gratch, Jonathan},
	month = jul,
	year = {2018},
	keywords = {UARC, Virtual Humans},
	pages = {1250--1257}
}