Multimodal Analysis and Estimation of Intimate Self-Disclosure (bibtex)
by Soleymani, Mohammad, Stefanov, Kalin, Kang, Sin-Hwa, Ondras, Jan and Gratch, Jonathan
Abstract:
Self-disclosure to others has proven benefits for one's mental health. Disclosure to computers has been shown to be similarly beneficial for emotional and psychological well-being. In this paper, we analyzed verbal and nonverbal behavior associated with self-disclosure in two datasets containing structured human-human and human-agent interviews from more than 200 participants. Correlation analysis revealed that linguistic features such as affective and cognitive content in verbal behavior, and nonverbal behavior such as head gestures, are associated with intimate self-disclosure. A multimodal deep neural network was developed to automatically estimate the level of intimate self-disclosure from verbal and nonverbal behavior. Among modalities, verbal behavior was the best for estimating self-disclosure within-corpus, achieving r = 0.66. However, cross-corpus evaluation demonstrated that nonverbal behavior can outperform the language modality. Such automatic models can be deployed in interactive virtual agents or social robots to evaluate rapport and guide their conversational strategy.
Reference:
Multimodal Analysis and Estimation of Intimate Self-Disclosure (Soleymani, Mohammad, Stefanov, Kalin, Kang, Sin-Hwa, Ondras, Jan and Gratch, Jonathan), In Proceedings of the 2019 International Conference on Multimodal Interaction (ICMI '19), ACM Press, 2019.
Bibtex Entry:
@inproceedings{soleymani_multimodal_2019,
	address = {Suzhou, China},
	title = {Multimodal {Analysis} and {Estimation} of {Intimate} {Self}-{Disclosure}},
	isbn = {978-1-4503-6860-5},
	url = {http://dl.acm.org/citation.cfm?doid=3340555.3353737},
	doi = {10.1145/3340555.3353737},
	abstract = {Self-disclosure to others has proven benefits for one's mental health. Disclosure to computers has been shown to be similarly beneficial for emotional and psychological well-being. In this paper, we analyzed verbal and nonverbal behavior associated with self-disclosure in two datasets containing structured human-human and human-agent interviews from more than 200 participants. Correlation analysis revealed that linguistic features such as affective and cognitive content in verbal behavior, and nonverbal behavior such as head gestures, are associated with intimate self-disclosure. A multimodal deep neural network was developed to automatically estimate the level of intimate self-disclosure from verbal and nonverbal behavior. Among modalities, verbal behavior was the best for estimating self-disclosure within-corpus, achieving r = 0.66. However, cross-corpus evaluation demonstrated that nonverbal behavior can outperform the language modality. Such automatic models can be deployed in interactive virtual agents or social robots to evaluate rapport and guide their conversational strategy.},
	booktitle = {Proceedings of the 2019 {International} {Conference} on {Multimodal} {Interaction} ({ICMI} '19)},
	publisher = {ACM Press},
	author = {Soleymani, Mohammad and Stefanov, Kalin and Kang, Sin-Hwa and Ondras, Jan and Gratch, Jonathan},
	month = oct,
	year = {2019},
	keywords = {MxR, UARC, Virtual Humans},
	pages = {59--68}
}