Engaging with the Scenario: Affect and Facial Patterns from a Scenario-Based Intelligent Tutoring System
by Benjamin D. Nye, Shamya Karumbaiah, S. Tugba Tokel, Mark G. Core, Giota Stratou, Daniel Auerbach, Kallirroi Georgila
Abstract:
Facial expression trackers output measures for facial action units (AUs) and are increasingly being used in learning technologies. In this paper, we compile patterns of AUs reported in related work and use factor analysis to search for categories implicit in our corpus. Although there was some overlap between the factors in our data and previous work, we also identified factors seen in the broader literature but not previously reported in the context of learning environments. In a correlational analysis, we found evidence for relationships between factors and self-reported traits such as academic effort, study habits, and interest in the subject. In addition, we saw differences in the average levels of factors between a video-watching activity and a decision-making activity. However, in this analysis we were not able to isolate any facial expressions with a significant positive or negative relationship to either learning gain or performance once question difficulty and related factors were also considered. Given the overall low levels of facial affect in the corpus, further research will explore different populations and learning tasks to test the hypothesis that learners may have been in a pattern of “Over-Flow”, in which they were engaged with the system but not deeply thinking about the content or their errors.
Reference:
Engaging with the Scenario: Affect and Facial Patterns from a Scenario-Based Intelligent Tutoring System (Benjamin D. Nye, Shamya Karumbaiah, S. Tugba Tokel, Mark G. Core, Giota Stratou, Daniel Auerbach, Kallirroi Georgila), In Proceedings of the International Conference on Artificial Intelligence in Education, Springer International Publishing, volume 10947, pages 352-366, 2018.
BibTeX Entry:
@inproceedings{nye_engaging_2018,
	address = {London, UK},
	title = {Engaging with the {Scenario}: {Affect} and {Facial} {Patterns} from a {Scenario}-{Based} {Intelligent} {Tutoring} {System}},
	volume = {10947},
	isbn = {978-3-319-93842-4, 978-3-319-93843-1},
	url = {http://link.springer.com/10.1007/978-3-319-93843-1_26},
	abstract = {Facial expression trackers output measures for facial action units (AUs) and are increasingly being used in learning technologies. In this paper, we compile patterns of AUs reported in related work and use factor analysis to search for categories implicit in our corpus. Although there was some overlap between the factors in our data and previous work, we also identified factors seen in the broader literature but not previously reported in the context of learning environments. In a correlational analysis, we found evidence for relationships between factors and self-reported traits such as academic effort, study habits, and interest in the subject. In addition, we saw differences in the average levels of factors between a video-watching activity and a decision-making activity. However, in this analysis we were not able to isolate any facial expressions with a significant positive or negative relationship to either learning gain or performance once question difficulty and related factors were also considered. Given the overall low levels of facial affect in the corpus, further research will explore different populations and learning tasks to test the hypothesis that learners may have been in a pattern of “Over-Flow”, in which they were engaged with the system but not deeply thinking about the content or their errors.},
	booktitle = {Proceedings of the {International} {Conference} on {Artificial} {Intelligence} in {Education}},
	publisher = {Springer International Publishing},
	author = {Nye, Benjamin D. and Karumbaiah, Shamya and Tokel, S. Tugba and Core, Mark G. and Stratou, Giota and Auerbach, Daniel and Georgila, Kallirroi},
	month = jun,
	year = {2018},
	doi = {10.1007/978-3-319-93843-1_26},
	keywords = {Learning Sciences, UARC, Virtual Humans},
	pages = {352--366}
}