Establishing Metrics and Creating Standards: Quantifying Efficacy of Battlefield Simulations
by Davis, Dan M, Guizani, Skander and Jaksha, Evan
Abstract:
This paper asserts that quantification and verification of battlefield simulations are necessary to assess, verify, and guide researchers, military commanders, and users in both the simulations’ development and their implementation. The authors present their observations on previous development activities that were hampered by a lack of effective metrics and argue that much of this was driven by a lack of standards. Tracing back through commonly accepted System Engineering practices, they show how the absence of such standards makes even the development of effective metrics problematic. The paper documents these experiences and enumerates the potential pitfalls of these shortcomings. Both the authors' experience in military service and the technical literature supporting their theses are adduced to support their analysis of the current technical research and development environment. The paper then evaluates several System Engineering tools to further investigate and establish the ultimate goals of these formalized processes. Using their current project establishing virtual on-line mentors as an exemplar of the way such tools would be effective, the authors make a case for metrics standards that are both accepted by consensus and ultimately directed at providing warfighters with all of the training possible before putting them in harm's way and imperiling the missions for which they put themselves at risk. Examples of the nature of, and reactions to, simulator training, virtual human interaction, computer agent interfaces, and implementation issues are given to illuminate possible extensions of these approaches into the reader's own research, and to call for community-wide recognition of the need for standards, both for implementation and for metrics, to assess battlefield simulation utility to the warfighter. Future investigations, analysis, and action are considered and evaluated.
Reference:
Establishing Metrics and Creating Standards: Quantifying Efficacy of Battlefield Simulations (Davis, Dan M, Guizani, Skander and Jaksha, Evan), In SISO Simulation Innovation Workshop, 2020.
Bibtex Entry:
@article{davis_establishing_2020,
	title = {Establishing {Metrics} and {Creating} {Standards}: {Quantifying} {Efficacy} of {Battlefield} {Simulations}},
	url = {https://www.sisostds.org/Default.aspx?tabid=105&EntryId=51197},
	abstract = {This paper asserts that quantification and verification of battlefield simulations are necessary to assess, verify, and guide researchers, military commanders, and users in both the simulations’ development and their implementation. The authors present their observations on previous development activities that were hampered by a lack of effective metrics and argue that much of this was driven by a lack of standards. Tracing back through commonly accepted System Engineering practices, they show how the absence of such standards makes even the development of effective metrics problematic. The paper documents these experiences and enumerates the potential pitfalls of these shortcomings. Both the authors' experience in military service and the technical literature supporting their theses are adduced to support their analysis of the current technical research and development environment. The paper then evaluates several System Engineering tools to further investigate and establish the ultimate goals of these formalized processes. Using their current project establishing virtual on-line mentors as an exemplar of the way such tools would be effective, the authors make a case for metrics standards that are both accepted by consensus and ultimately directed at providing warfighters with all of the training possible before putting them in harm's way and imperiling the missions for which they put themselves at risk. Examples of the nature of, and reactions to, simulator training, virtual human interaction, computer agent interfaces, and implementation issues are given to illuminate possible extensions of these approaches into the reader's own research, and to call for community-wide recognition of the need for standards, both for implementation and for metrics, to assess battlefield simulation utility to the warfighter. Future investigations, analysis, and action are considered and evaluated.},
	number = {2020\_SIW\_52},
	journal = {SISO Simulation Innovation Workshop},
	author = {Davis, Dan M and Guizani, Skander and Jaksha, Evan},
	month = apr,
	year = {2020},
	keywords = {Learning Sciences, UARC},
	pages = {11}
}