Improving Fluency in Narrative Text Generation With Grammatical Transformations and Probabilistic Parsing
by Emily Ahn, Fabrizio Morbini, Andrew S. Gordon
Abstract:
In research on automatic generation of narrative text, story events are often formally represented as a causal graph. When serializing and realizing this causal graph as natural language text, simple approaches produce cumbersome sentences with repetitive syntactic structure, e.g. long chains of “because” clauses. In our research, we show that the fluency of narrative text generated from causal graphs can be improved by applying rule-based grammatical transformations to generate many sentence variations with equivalent semantics, then selecting the variation that has the highest probability using a probabilistic syntactic parser. We evaluate our approach by generating narrative text from causal graphs that encode 100 brief stories involving the same three characters, based on a classic film of experimental social psychology. Crowdsourced workers judged the writing quality of texts generated with ranked transformations as significantly higher than those without, and not significantly lower than human-authored narratives of the same situations.
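The selection step described above — generate several semantically equivalent sentence variants, then keep the one a probabilistic syntactic parser assigns the highest probability — can be sketched with a toy Viterbi CKY parser. The grammar, rules, and probabilities below are invented for illustration only; the paper uses a broad-coverage trained parser, not this hand-built PCFG:

```python
from collections import defaultdict

# Toy PCFG in Chomsky normal form. All probabilities are illustrative
# assumptions, not the paper's model.
LEXICAL = {           # A -> word
    ("NP", "she"): 0.4, ("NP", "he"): 0.4, ("NP", "her"): 0.2,
    ("VP", "left"): 0.5, ("V", "pushed"): 1.0,
    ("COMP", "because"): 1.0,
}
BINARY = {            # A -> B C
    ("S", "NP", "VP"): 0.6,
    ("S", "S", "SBAR"): 0.25,   # trailing "because" clause
    ("S", "SBAR", "S"): 0.15,   # fronted "because" clause (rarer here)
    ("SBAR", "COMP", "S"): 1.0,
    ("VP", "V", "NP"): 0.5,
}

def viterbi_prob(words):
    """Probability of the best parse of `words` under the toy PCFG (CKY)."""
    n = len(words)
    chart = defaultdict(float)  # (i, j, symbol) -> best probability
    # Fill in lexical (length-1) spans.
    for i, w in enumerate(words):
        for (sym, word), p in LEXICAL.items():
            if word == w:
                chart[i, i + 1, sym] = max(chart[i, i + 1, sym], p)
    # Combine smaller spans bottom-up with binary rules.
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for (a, b, c), p in BINARY.items():
                    q = p * chart[i, k, b] * chart[k, j, c]
                    if q > chart[i, j, a]:
                        chart[i, j, a] = q
    return chart[0, n, "S"]

# Two grammatical transformations of the same underlying causal content;
# the parser's probability decides which surface form to emit.
variants = [
    "she left because he pushed her",   # trailing causal clause
    "because he pushed her she left",   # fronted causal clause
]
for v in variants:
    print(f"{viterbi_prob(v.split()):.6f}  {v}")
best = max(variants, key=lambda s: viterbi_prob(s.split()))
print("selected:", best)
```

Under this toy grammar the trailing-clause variant scores higher and is selected; with a trained parser the same ranking mechanism prefers whichever transformation yields the most fluent surface form.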
Reference:
Improving Fluency in Narrative Text Generation With Grammatical Transformations and Probabilistic Parsing (Emily Ahn, Fabrizio Morbini, Andrew S. Gordon), In Proceedings of the 9th International Natural Language Generation Conference (INLG-2016), 2016.
Bibtex Entry:
@inproceedings{ahn_improving_2016,
	address = {Edinburgh, UK},
	title = {Improving {Fluency} in {Narrative} {Text} {Generation} {With} {Grammatical} {Transformations} and {Probabilistic} {Parsing}},
	url = {https://www.researchgate.net/publication/307512031_Improving_Fluency_in_Narrative_Text_Generation_With_Grammatical_Transformations_and_Probabilistic_Parsing},
	abstract = {In research on automatic generation of narrative text, story events are often formally represented as a causal graph. When serializing and realizing this causal graph as natural language text, simple approaches produce cumbersome sentences with repetitive syntactic structure, e.g. long chains of “because” clauses. In our research, we show that the fluency of narrative text generated from causal graphs can be improved by applying rule-based grammatical transformations to generate many sentence variations with equivalent semantics, then selecting the variation that has the highest probability using a probabilistic syntactic parser. We evaluate our approach by generating narrative text from causal graphs that encode 100 brief stories involving the same three characters, based on a classic film of experimental social psychology. Crowdsourced workers judged the writing quality of texts generated with ranked transformations as significantly higher than those without, and not significantly lower than human-authored narratives of the same situations.},
	booktitle = {Proceedings of the 9th {International} {Natural} {Language} {Generation} {Conference} ({INLG}-2016)},
	author = {Ahn, Emily and Morbini, Fabrizio and Gordon, Andrew S.},
	month = sep,
	year = {2016},
	keywords = {Narrative, Virtual Humans}
}