711 HPW demonstrates value of collaborative training research at I/ITSEC

  • By John Schutte
  • 711th Human Performance Wing
Featuring technology developed through collaborations with private industry, the Air Force Research Laboratory's 711th Human Performance Wing Human Effectiveness Directorate, or 711 HPW/RH, demonstrated the value of interoperable game-based environments for military training and simulation at the 2009 Interservice/Industry Training, Simulation & Education Conference (I/ITSEC), held November 30 through December 3 in Orlando.

An ongoing research activity for the 711 HPW/RH's Warfighter Readiness Research Division, RHA, is the development of interactive gateways with interface controls and exchanges that permit commercial game-based environments to function seamlessly with each other and with RHA's Live-Virtual-Constructive training concept.

Using technology transfer vehicles such as a Cooperative Research and Development Agreement, or CRADA, enables Air Force researchers to leverage existing commercial video-gaming software and tools. This collaborative approach can cut significant time and cost from the development of new military training capabilities, and can result in an improved commercial product as well.

"By developing these collaborative approaches, we can very rapidly identify existing relevant environments that might potentially fit a training requirement, integrate them, and evaluate new capabilities while leaving each environment's underlying software structure unchanged," said Dr. Winston "Wink" Bennett, RHA training and assessment research technical advisor.

RHA's demonstration shows that integrating existing technologies can reduce development time while increasing the training value of each technology as part of a larger and more integrated "family-of-complementary-trainers" enterprise, Dr. Bennett said.

"The capabilities we're showcasing this year were integrated in less than three months to test our rapid development and prototyping process," Dr. Bennett said.

Several commercially available software tools such as Aptima, Inc.'s Distributed Dynamic Decision-Making synthetic task environment developer and the Blue Box HD™ tool suite developed by L3 Communications Link Simulation and Training were leveraged under CRADAs to create a new training component.

In the new training capability, the software packages blend seamlessly even though each was developed independently as a stand-alone software environment.

The Mesa, Arizona-based RHA team also showcased the interoperability of a voice-enabled synthetic agent system with the Joint Technology Center/Systems Integration Laboratory's Air Force Synthetic Environment for Reconnaissance and Surveillance/Multiple Unified Simulation Environment unmanned aerial systems simulation.

The agent technology is called the Virtual Interactive Pattern Environment and Radiocomms Simulator, or VIPERS. It uses cognitive agents (software components that can behave intelligently in complex systems over extended time periods) and speech interaction in concert with desktop simulation to provide "pattern-in-a-box" practice tools.

Developed in collaboration with Air Education and Training Command and CHI Systems, Inc., VIPERS allows student pilots to practice radio communications using a simulated pattern environment, artificially intelligent agents, and voice commands. CHI Systems specializes in research and development that makes computer-based systems more lifelike and easier to use.

Using synthetic agents to communicate with a trainee in a computer-based virtual world also reduces the need for human trainers and for expensive systems such as aircraft and, in the case of air traffic controllers, control towers. The result is drastically reduced costs and lower risk, according to John Paulus, a software developer with CHI Systems.

"The more synthetic agents you have in a system the less you have to rely on live humans for training," Mr. Paulus said.

Joint terminal attack controllers who serve as subject matter experts on the RHA team showed how the communications capabilities enabled by the agents can be used to direct and redirect a UAS to different targets. Using the system, JTACs can learn and practice real-time mission skills and effective coordination with UAS crews, including spoken and text interaction; in this case, however, the UAS operators are voice-enabled software agents.

Dr. Bennett said the integration of these two technologies, which had not previously been accomplished, provides new training capabilities and enhancements beyond the existing training for UAS operators and ground controllers.

"By leveraging these technologies we are ultimately helping to better prepare the warfighter to accomplish their mission," added 1st Lt. Omayra Genao, technical lead for the I/ITSEC demonstrations.

A third RHA demonstration featured a multimedia presentation of a game-based approach to space training called the Standard Space Trainer, or SST.

The SST is a major training transition success developed in collaboration with the Space and Missile Systems Center and Air Force Space Command headquarters, along with other RHA training research partners in academia, industry and the other services.