Background
This basic research project investigates a foreseeable extended reality (XR) capability that enables users to contribute key situational observations and annotate an environment, much like collaborators working on a shared document in a cloud-based platform. User-generated data, pooled with existing intelligence, can then be evaluated by AI and machine-learning algorithms as well as human analysts to uncover patterns, detect changes and anomalies, and infer contextual insights; communicating these insights to in-field operators can improve situational awareness and decision-making.
Realizing the strategic advantages of this augmented reality (AR) capability requires research into both the input user interfaces and the output visualization modalities for world-fixed annotations in an intelligent mixed-reality system that embeds information geo-specifically into an environment.
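Geo-specific, world-fixed embedding means each annotation is anchored to an absolute position in the environment rather than to a screen location. The sketch below is illustrative only and is not drawn from the A2E2 implementation; it shows one common way such anchoring is done, converting a WGS-84 latitude/longitude/altitude observation into local East-North-Up (ENU) offsets from a device's geodetic reference so an AR engine can place the annotation at a fixed point in the world. All function and variable names are assumptions.

```python
# Minimal sketch (not A2E2 code): anchor a geo-specific annotation by converting
# its WGS-84 coordinates into East-North-Up offsets from the device's reference.
import math

A = 6378137.0                # WGS-84 semi-major axis (m)
E2 = 0.0066943799901413165   # WGS-84 first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert latitude/longitude/altitude to Earth-Centered Earth-Fixed (m)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + h) * math.sin(lat)
    return x, y, z

def ecef_to_enu(target, ref_lat_deg, ref_lon_deg, ref_ecef):
    """Express an ECEF point as East/North/Up offsets from a reference point."""
    lat, lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
    dx, dy, dz = (t - r for t, r in zip(target, ref_ecef))
    east = -math.sin(lon) * dx + math.cos(lon) * dy
    north = (-math.sin(lat) * math.cos(lon) * dx
             - math.sin(lat) * math.sin(lon) * dy
             + math.cos(lat) * dz)
    up = (math.cos(lat) * math.cos(lon) * dx
          + math.cos(lat) * math.sin(lon) * dy
          + math.sin(lat) * dz)
    return east, north, up

# Hypothetical example: an observation roughly 100 m north of the device.
ref = (34.0522, -118.2437, 50.0)   # device's geodetic reference (lat, lon, alt)
obs = (34.0531, -118.2437, 50.0)   # annotated observation location
enu = ecef_to_enu(geodetic_to_ecef(*obs), ref[0], ref[1], geodetic_to_ecef(*ref))
print(enu)  # ENU offset (m) at which to place the world-fixed annotation
```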
Objectives
The A2E2 project proposes that AR world-fixed information (derived from collective intelligence encompassing existing intel, crowdsourced information, and user-generated observations), embedded geo-specifically into the environment, will lead to improved performance by the downrange warfighter compared with the conventional method of text- and screen-based information delivery. Assessment of this capability will test its potential to increase situational awareness and understanding and to improve the speed and quality of decision-making.
Results
To date, the MxR team has created a virtualized augmented reality test-bed environment to compare information dissemination under two conditions: a) the current paradigm, an ATAK-style, tablet-based heads-down display, and b) a future-imagined heads-up display (HUD) computational ecosystem in which information is presented spatially within the environment.
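For illustration only (none of the names below are from the project's test bed), one way to structure such a comparison is to feed a single annotation record into either presentation condition: a textual heads-down list, as on an ATAK-style tablet, or a world-fixed spatial placement for the HUD condition.

```python
# Illustrative sketch (not project code): one annotation record routed to the
# two presentation conditions being compared in the test bed.
from dataclasses import dataclass

@dataclass
class Annotation:
    label: str    # e.g., "blocked road"
    lat: float    # geodetic latitude of the observation
    lon: float    # geodetic longitude of the observation
    source: str   # "operator", "crowdsourced", or "existing intel"

def render_heads_down(annotations: list[Annotation]) -> list[str]:
    """Condition A: tablet-style heads-down display -- a textual list the user
    must look away from the environment to read."""
    return [f"[{a.source}] {a.label} @ ({a.lat:.5f}, {a.lon:.5f})"
            for a in annotations]

def render_hud(annotations: list[Annotation], place_world_fixed) -> None:
    """Condition B: HUD ecosystem -- each annotation is anchored at its
    geo-specific position via a placement callback supplied by the AR engine."""
    for a in annotations:
        place_world_fixed(a.lat, a.lon, a.label)
```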
Next Steps
The team is finalizing a between-subjects user study employing an emergency-response scenario and will begin running experiments with subjects in Q4 FY24. The effectiveness of this future-imagined capability will be assessed using task measures including decision-making speed and quality, post-tests of situational awareness and sense-making, and self-reported usability and perceived performance.