Mutual Behaviors during Dyadic Negotiation: Automatic Prediction of Respondent Reactions
by Park, Sunghyun, Scherer, Stefan, Gratch, Jonathan, Carnevale, Peter and Morency, Louis-Philippe
Abstract:
In this paper, we analyze face-to-face negotiation interactions with the goal of predicting the respondent's immediate reaction (i.e., accept or reject) to a negotiation offer. Supported by the theory of social rapport, we focus on mutual behaviors, which are defined as nonverbal characteristics that occur due to interactional influence. These patterns include behavioral symmetry (e.g., synchronized smiles) as well as asymmetry (e.g., opposite postures) between the two negotiators. In addition, we put emphasis on finding audio-visual mutual behaviors that can be extracted automatically, with the vision of a real-time decision support tool. We introduce a dyadic negotiation dataset consisting of 42 face-to-face interactions and show experiments confirming the importance of multimodal and mutual behaviors.
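To make the notion of a "mutual behavior" feature concrete, the following is a minimal sketch, not the authors' actual pipeline: it measures behavioral symmetry as the correlation between the two negotiators' smile-intensity time series around an offer and feeds that feature to a simple accept/reject classifier. The function name, toy data, and classifier choice are all illustrative assumptions.

# Minimal sketch of one "mutual behavior" (symmetry) feature for
# accept/reject prediction. Hypothetical names and synthetic data;
# the paper's real features and models are not reproduced here.
import numpy as np
from sklearn.linear_model import LogisticRegression

def smile_symmetry(proposer_smile: np.ndarray, respondent_smile: np.ndarray) -> float:
    """Pearson correlation of two aligned smile-intensity series;
    values near +1 indicate synchronized (symmetric) smiling."""
    return float(np.corrcoef(proposer_smile, respondent_smile)[0, 1])

rng = np.random.default_rng(0)

# Toy dataset: 42 offers, each with two aligned smile-intensity series.
n_offers, n_frames = 42, 100
X, y = [], []
for i in range(n_offers):
    proposer = rng.random(n_frames)
    if i % 2 == 0:
        # Even-indexed offers mimic synchronized smiling (labeled accept).
        respondent = proposer + 0.1 * rng.standard_normal(n_frames)
    else:
        respondent = rng.random(n_frames)
    X.append([smile_symmetry(proposer, respondent)])
    y.append(1 if i % 2 == 0 else 0)  # 1 = accept, 0 = reject

clf = LogisticRegression().fit(np.array(X), np.array(y))
print("train accuracy:", clf.score(np.array(X), np.array(y)))

In practice one would compute such symmetry and asymmetry features across multiple audio-visual channels (smiles, posture, prosody) and evaluate with proper train/test splits; the single correlated feature above is only meant to show the shape of the idea.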
Reference:
Mutual Behaviors during Dyadic Negotiation: Automatic Prediction of Respondent Reactions (Park, Sunghyun, Scherer, Stefan, Gratch, Jonathan, Carnevale, Peter and Morency, Louis-Philippe), In Affective Computing and Intelligent Interaction, 2013.
Bibtex Entry:
@inproceedings{park_mutual_2013,
	address = {Geneva, Switzerland},
	title = {Mutual {Behaviors} during {Dyadic} {Negotiation}: {Automatic} {Prediction} of {Respondent} {Reactions}},
	isbn = {978-0-7695-5048-0},
	url = {http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=6681467},
	doi = {10.1109/ACII.2013.76},
	abstract = {In this paper, we analyze face-to-face negotiation interactions with the goal of predicting the respondent's immediate reaction (i.e., accept or reject) to a negotiation offer. Supported by the theory of social rapport, we focus on mutual behaviors, which are defined as nonverbal characteristics that occur due to interactional influence. These patterns include behavioral symmetry (e.g., synchronized smiles) as well as asymmetry (e.g., opposite postures) between the two negotiators. In addition, we put emphasis on finding audio-visual mutual behaviors that can be extracted automatically, with the vision of a real-time decision support tool. We introduce a dyadic negotiation dataset consisting of 42 face-to-face interactions and show experiments confirming the importance of multimodal and mutual behaviors.},
	booktitle = {Affective {Computing} and {Intelligent} {Interaction}},
	author = {Park, Sunghyun and Scherer, Stefan and Gratch, Jonathan and Carnevale, Peter and Morency, Louis-Philippe},
	month = sep,
	year = {2013},
	keywords = {Virtual Humans, UARC},
	pages = {423--428}
}