Physiological and behavioral correlates of babies’ social engagement with robot and virtual human artificial intelligence agents
by Barbara Manini, Katherine Tsui, Adam Stone, Brian Scassellati, David Traum, Arcangelo Merla, Laura Ann Petitto
Abstract:
Exposure to the patterns of natural language in early life—especially in ways that are rich in socially contingent interaction and conversation—is among the most powerful facilitators of the human language acquisition process (Petitto et al., 2016). Adults’ infant-directed language (e.g., simple rhythmic nursery rhymes), communicated in social interactions with joint attention, supports babies’ biological predisposition to language development in the first year of life (Brooks & Meltzoff, 2015). Yet many babies, such as deaf babies, have minimal language exposure in early life, which can have devastating consequences for their language learning and reading success. With the aim of developing a learning tool for babies deprived of natural language input during sensitive periods in human development, we studied whether artificially intelligent agents (social robots and virtual humans) can serve as augmentative communicative partners in early infancy. Using innovative thermal IR imaging technology, we recorded, imaged, and analyzed infants’ emotional arousal and behavioral responses during social interactions with a robot and a virtual human, as compared with a real human. We asked whether babies’ physiological and behavioral responses of joint attention during these robot and virtual human interactions were similar to or different from those during interactions with a real human. We hypothesized that if baby-artificial agent emotional arousal measures were similar to those observed with real humans, then artificial agents may serve as a promising tool for facilitating language learning in infants with minimal early-life language exposure.
Methods: Ten hearing (nonsigning) infants participated (five aged 6-9 months; five aged 9-12 months). Following Meltzoff et al. (2010), after a brief familiarization period with the robot, infants participated in 6-10 episodes of robot head and eye gaze turning (left or right). Two screens were placed, one on each side of the robot, so that it appeared to be “looking at the screen” when it turned its head. Contiguous with the robot’s gaze/head turn, both screens showed a nursery rhyme in ASL, performed alternately by a virtual human or a real human (physical features and linguistic content were held constant).
Results: Time-locked, integrated infant behavioral and thermal responses were analyzed (cf. Merla, 2004; Manini et al., 2013). (1) Behavioral data showed that babies followed the robot’s gaze, and the thermal IR data added new insights: a significant increase in nasal-tip temperature was observed, indicative of suppression of sympathetic activity and an increase in parasympathetic/pro-social attentiveness. (2) Thermal responses to the virtual human versus the real human revealed a phasic decrease in temperature, likely associated with increased vigilance and higher cognitive attention processes (e.g., match-mismatch analysis).
Discussion: Robots and virtual humans may be effective as augmentative communicative partners for young babies. Novel here, we observed an integrated physiological and behavioral response of joint attention and social engagement during babies’ interaction with the robot. Moreover, the virtual human elicited a peaked attentional arousal reaction, which may be indicative of linguistic stimulus detection and/or a “readiness to learn.” The integration of physiological and behavioral responses provides insights that pave the way for groundbreaking applications in the field of artificial intelligence (Merla, 2014) and augmentative learning tools that promote language acquisition in young children.
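Note: the published abstract does not include analysis code. The Python sketch below is a minimal, hypothetical illustration of the kind of time-locked thermal ROI analysis the Results describe (cf. Merla, 2004): it averages the temperature inside a nasal-tip region of interest frame by frame, epochs that signal around event onsets, and baseline-corrects each epoch. The frame rate, window lengths, ROI coordinates, event times, and function names are all assumptions made for illustration, not the authors’ actual pipeline.

# Illustrative sketch only: a hypothetical reconstruction of a time-locked
# thermal ROI analysis of the kind described in the abstract (cf. Merla, 2004).
# Frame rate, windows, ROI bounds, and event times below are assumptions.
import numpy as np

FPS = 30                           # assumed thermal-camera frame rate (Hz)
BASELINE_S, WINDOW_S = 2.0, 10.0   # assumed baseline / post-event windows (s)

def roi_temperature(frames, roi):
    """Mean temperature of a rectangular ROI (e.g., the nasal tip) per frame.

    frames: (n_frames, height, width) array of per-pixel temperatures (deg C).
    roi:    (top, bottom, left, right) pixel bounds of the region of interest.
    """
    top, bottom, left, right = roi
    return frames[:, top:bottom, left:right].mean(axis=(1, 2))

def time_locked_response(signal, event_frames):
    """Epoch the ROI signal around each event onset and baseline-correct it.

    Returns the event-averaged temperature change relative to the pre-event
    baseline: positive values indicate warming (as reported for the nasal tip
    with the robot), negative values a phasic cooling (as reported for the
    virtual human).
    """
    pre, post = int(BASELINE_S * FPS), int(WINDOW_S * FPS)
    epochs = []
    for onset in event_frames:
        if onset - pre < 0 or onset + post > len(signal):
            continue  # skip events too close to the recording edges
        segment = signal[onset - pre : onset + post]
        epochs.append(segment - segment[:pre].mean())  # baseline subtraction
    return np.mean(epochs, axis=0)

# Usage with synthetic data standing in for a thermal recording:
rng = np.random.default_rng(0)
video = 34.0 + 0.1 * rng.standard_normal((3000, 60, 80), dtype=np.float32)
signal = roi_temperature(video, roi=(25, 35, 35, 45))        # hypothetical ROI
response = time_locked_response(signal, event_frames=[600, 1200, 1800, 2400])
print(f"Peak change from baseline: {response.max():+.3f} deg C")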
Reference:
Physiological and behavioral correlates of babies’ social engagement with robot and virtual human artificial intelligence agents (Barbara Manini, Katherine Tsui, Adam Stone, Brian Scassellati, David Traum, Arcangelo Merla, Laura Ann Petitto), In Proceedings of SRCD, 2017.
Bibtex Entry:
@inproceedings{manini_physiological_2017,
	address = {Austin, TX},
	title = {Physiological and behavioral correlates of babies’ social engagement with robot and virtual human artificial intelligence agents},
	url = {https://www.researchgate.net/publication/316167858_Physiological_and_behavioral_correlates_of_babies'_social_engagement_with_robot_and_virtual_human_artificial_intelligence_agents},
	abstract = {Exposure to the patterns of natural language in early life—especially in ways that are rich in socially contingent interaction and conversation—is among the most powerful facilitators of the human language acquisition process (Petitto et al., 2016). Adults’ infant-directed language (e.g., simple rhythmic nursery rhymes), communicated in social interactions with joint attention, supports babies’ biological predisposition to language development in the first year of life (Brooks \& Meltzoff, 2015). Yet many babies, such as deaf babies, have minimal language exposure in early life, which can have devastating consequences for their language learning and reading success. With the aim of developing a learning tool for babies deprived of natural language input during sensitive periods in human development, we studied whether artificially intelligent agents (social robots and virtual humans) can serve as augmentative communicative partners in early infancy. Using innovative thermal IR imaging technology, we recorded, imaged, and analyzed infants’ emotional arousal and behavioral responses during social interactions with a robot and a virtual human, as compared with a real human. We asked whether babies’ physiological and behavioral responses of joint attention during these robot and virtual human interactions were similar to or different from those during interactions with a real human. We hypothesized that if baby-artificial agent emotional arousal measures were similar to those observed with real humans, then artificial agents may serve as a promising tool for facilitating language learning in infants with minimal early-life language exposure.
Methods: Ten hearing (nonsigning) infants participated (five aged 6-9 months; five aged 9-12 months). Following Meltzoff et al. (2010), after a brief familiarization period with the robot, infants participated in 6-10 episodes of robot head and eye gaze turning (left or right). Two screens were placed, one on each side of the robot, so that it appeared to be “looking at the screen” when it turned its head. Contiguous with the robot’s gaze/head turn, both screens showed a nursery rhyme in ASL, performed alternately by a virtual human or a real human (physical features and linguistic content were held constant).
Results: Time-locked, integrated infant behavioral and thermal responses were analyzed (cf. Merla, 2004; Manini et al., 2013). (1) Behavioral data showed that babies followed the robot’s gaze, and the thermal IR data added new insights: a significant increase in nasal-tip temperature was observed, indicative of suppression of sympathetic activity and an increase in parasympathetic/pro-social attentiveness. (2) Thermal responses to the virtual human versus the real human revealed a phasic decrease in temperature, likely associated with increased vigilance and higher cognitive attention processes (e.g., match-mismatch analysis).
Discussion: Robots and virtual humans may be effective as augmentative communicative partners for young babies. Novel here, we observed an integrated physiological and behavioral response of joint attention and social engagement during babies’ interaction with the robot. Moreover, the virtual human elicited a peaked attentional arousal reaction, which may be indicative of linguistic stimulus detection and/or a “readiness to learn.” The integration of physiological and behavioral responses provides insights that pave the way for groundbreaking applications in the field of artificial intelligence (Merla, 2014) and augmentative learning tools that promote language acquisition in young children.},
	booktitle = {Proceedings of {SRCD}},
	author = {Manini, Barbara and Tsui, Katherine and Stone, Adam and Scassellati, Brian and Traum, David and Merla, Arcangelo and Petitto, Laura Ann},
	month = apr,
	year = {2017},
	keywords = {Virtual Humans}
}