Tyler Gates, Managing Principal, Brightline Interactive

Tyler is the Managing Principal at Brightline Interactive and President of the DC Chapter of the VR/AR Association. He is a visionary for the application of VR/AR technology to training, enterprise, and branding solutions. His technological prowess, combined with his aptitude for behavioral psychology, enables him to transform client needs into tangible concepts through the use of immersive technology. He has adapted Brightline’s award-winning, highly immersive and interactive technology to create unique solutions for customers ranging from government agencies to commercial brands. Through the VR/AR Association, Tyler leads efforts to further the growth of VR/AR technology.

ABSTRACT
The Immersive Ecosystem: The Answer to Readiness

This paper examines the feasibility and effectiveness of artificial intelligence and sensor-driven virtual reality scenarios for assessing the warfighter and delivering more personalized training.

Until recently, technologies have operated largely in silos, with limited ability to interact with, connect to, and gather data from the individual. With the emergence of edge computing and immersive technology, we can now link these different tools into a fully connected system that drives personalization and efficiency within a training environment. In addition, with the military’s increasing desire to monitor and assess cognitive status during critical missions, we introduce psychological protocols such as “affect labeling”. The efficacy and underpinning mechanisms of this technique have been established at the neuroscience level in laboratory and clinical studies using functional near-infrared spectroscopy (fNIRS). Virtual reality (VR) allows edge computing and affect labeling to be integrated so that training data can be analyzed and visualized more effectively, in real time, within immersive, realistic, simulated environments. In short, in what we call “The Immersive Ecosystem”, we can use photorealistically rendered VR to create a virtual platform that assesses the warfighter and trains them more personally and efficiently.
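As a rough illustration of the kind of data this integration implies, the sketch below shows how an affect-labeling event and a concurrent window of fNIRS readings might be captured during a VR scenario and handed back to the running environment in real time. All names here (FNIRSSample, AffectLabelEvent, adjust_for_affect) are hypothetical assumptions for the sketch, not an existing system or API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class FNIRSSample:
    timestamp: datetime
    channel: str       # prefrontal cortex channel identifier (assumed)
    oxy_hb: float      # change in oxygenated hemoglobin
    deoxy_hb: float    # change in deoxygenated hemoglobin

@dataclass
class AffectLabelEvent:
    trainee_id: str
    scenario_id: str
    label: str                                   # the trainee's spoken label, e.g. "anxious"
    fnirs_window: List[FNIRSSample] = field(default_factory=list)
    logged_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def on_affect_label(event: AffectLabelEvent, scenario) -> None:
    """Pass the labeled event to the running VR scenario so it can adapt in real time.
    `scenario` and its adjust_for_affect method are placeholders for whatever
    engine hook actually drives the environment."""
    scenario.adjust_for_affect(event.label, event.fnirs_window)
```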

The Immersive Ecosystem

  1. Collect Data – Initial assessment and data collection occur through sensor integration and edge computing.
  2. Store Data – After the trainee undergoes the initial assessment, a User Performance Profile (UPP) is created and stored in either a local or a cloud-based database. The UPP is updated continuously as the trainee progresses through training.
  3. Interpret Data – Artificial Intelligence (AI) and machine learning are the underlying enablers of the database, intelligently analyzing and interpreting the performance data as users interact with the training system.
  4. Visualize Data – The analysis is visualized through media such as virtual and augmented reality platforms, in which the individualized performance data drives the visual scenario to mimic life-like, consequential situations. This allows for continuous, tailored training based on trainee-specific proficiencies and deficiencies.
  5. Measure Data – With the incorporation of fNIRS sensors, eye tracking, heat mapping, decision matrices, and other sensor-integrated tracking methods, we can capture highly personalized performance data and feed it back into the immersive feedback loop, continuously building a tailored performance profile (a minimal sketch of this loop follows the list).
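The five steps above form a closed feedback loop. The following is a minimal Python sketch of one pass through that loop; the class and object names (UserPerformanceProfile, sensors, store, model, scenario) are illustrative assumptions rather than an implemented API.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class UserPerformanceProfile:
    """Illustrative UPP record built up as the trainee progresses."""
    trainee_id: str
    proficiencies: Dict[str, float] = field(default_factory=dict)  # skill -> score
    deficiencies: Dict[str, float] = field(default_factory=dict)   # skill -> gap
    history: List[dict] = field(default_factory=list)              # raw session records

def training_iteration(trainee_id: str, sensors, store, model, scenario) -> UserPerformanceProfile:
    # 1. Collect: gather readings from integrated sensors at the edge
    readings = sensors.collect(trainee_id)

    # 2. Store: load (or create) the UPP and record the new session data
    upp = store.load(trainee_id) or UserPerformanceProfile(trainee_id)
    upp.history.append(readings)

    # 3. Interpret: an ML model turns raw readings into proficiency estimates
    upp.proficiencies, upp.deficiencies = model.interpret(upp.history)

    # 4. Visualize: the VR scenario adapts to target the trainee's weak areas
    scenario.render(target_skills=upp.deficiencies)

    # 5. Measure: in-scenario tracking (eye tracking, fNIRS, decision matrices)
    #    produces new measurements that feed the next pass through the loop
    upp.history.append(scenario.measure())

    store.save(upp)
    return upp
```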

Experts in sensor-integrated virtual reality training environments, data science, and behavioral neuroscience are working together to prove this concept for a better-trained warfighter. This study will be significant to warfighter training in the following ways:

  • To help both the instructor and trainee understand the cognitive status of individuals during decision-making in simulated scenarios.
  • To tailor individual training by allowing the virtual environment to react to emotional and decision cues, simulating real-life, consequential scenarios.
  • To build highly tailored performance profile systems that are specific to the individual trainee and will continue to learn and adapt to the trainee’s behavior.

This paper concludes with an outline of some future possibilities with database-driven VR in training environments.