WASHINGTON — The U.S. Army is driving realism into virtual training to enhance effectiveness, but it’s not an easy task — even for the gaming industry.

One of the challenges the service is facing as it embarks on developing a Synthetic Training Environment is “providing a realistic and immersive virtual training experience” that portrays “computer-generated people and objects behind real things and doing so in real time from multiple perspectives as actors and objects move around in the environment,” according to an Army statement released ahead of the Interservice/Industry Training, Simulation and Education Conference in Orlando, Florida.

This “conundrum,” as the Army puts it, comes down to improving “dynamic occlusion,” and the service is working with industry and academia to solve the issue.

The Army has been building a virtual world in which to train soldiers for war, and it awarded major contracts this year for reconfigurable virtual air and ground trainers as well as for a common synthetic environment that includes complex and real-life terrain.

Over the last two years, the components of the Synthetic Training Environment, or STE, have taken shape and will consist of One World Terrain — which compiles realistic and accurate virtual maps of territory — as well as training simulation software, a training management tool and virtual collective trainers. All of this will make up the soldier/squad virtual trainer and the reconfigurable virtual collective trainer.

The dynamic occlusion issue is one with which video gamers are well-acquainted. “When virtual projections within a player’s view of the world are not layered appropriately with real-world objects, the experience feels unnatural,” which is an undesirable attribute for a realistic training experience, the statement noted.
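
At its core, dynamic occlusion comes down to a per-pixel depth comparison: draw a virtual object only where it is closer to the viewer than whatever the real world puts in the way. Here is a minimal sketch in Python, assuming the hard part the Army describes (a real-time depth map registered to the display) is already solved; all names here are illustrative, not from any Army system:

```python
import numpy as np

def composite_frame(camera_rgb, real_depth, virtual_rgb, virtual_depth):
    """Per-pixel occlusion test: show a virtual pixel only where the virtual
    surface is closer to the viewer than the sensed real-world surface.

    camera_rgb    : (H, W, 3) live camera image
    real_depth    : (H, W) sensed depth in meters (np.inf where no reading)
    virtual_rgb   : (H, W, 3) rendered virtual content
    virtual_depth : (H, W) depth buffer of the virtual render (np.inf = empty)
    """
    virtual_in_front = virtual_depth < real_depth
    composite = camera_rgb.copy()
    composite[virtual_in_front] = virtual_rgb[virtual_in_front]
    return composite
```

Games get the real-world depth for free because their entire scene is virtual; an augmented reality trainer has to sense that depth live, which is where the difficulty lies.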

To achieve realism, the system must be able to sense dynamic changes to the mission environment, updating 3D terrain pictures — or meshes — in real time.
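
As a rough illustration of what such a real-time updating loop might look like, here is a toy sketch in Python; the height-grid representation, cell size and blend rate are assumptions made for the example, not details of the Army’s system:

```python
import numpy as np

class LiveTerrain:
    """Toy real-time terrain model: a height grid refreshed from depth samples."""
    def __init__(self, size=256, cell=0.25, blend=0.3):
        self.heights = np.zeros((size, size))  # height per grid cell, meters
        self.cell = cell                       # meters covered by each cell
        self.blend = blend                     # weight given to new readings

    def integrate(self, points):
        """points: (N, 3) sensed surface points (x, y, z) in world space."""
        ix = np.clip((points[:, 0] / self.cell).astype(int), 0, self.heights.shape[0] - 1)
        iy = np.clip((points[:, 1] / self.cell).astype(int), 0, self.heights.shape[1] - 1)
        # Blend new heights into the stored terrain so changes in the scene
        # (a parked vehicle, an opened door) show up within a few frames
        self.heights[ix, iy] = (1 - self.blend) * self.heights[ix, iy] \
                               + self.blend * points[:, 2]
```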

According to the Army, “in military scenarios, the problem can adversely affect the learning experience or lead to negative habit transfer if a soldier can’t realistically take cover or if a vehicle crew is hindered from accurately aiming and firing on an enemy position.”

The Army plans to mature and demonstrate augmented reality algorithms and techniques to occlude real or virtual dynamic objects in realistic, changing environments, the Army said in a statement to Defense News. “Occlusion of live, moving objects is challenging — doing so at long distances is even more so,” the service added.

The service’s augmented-reality, head-mounted displays for dismounted soldiers are limited to small, indoor environments because the hardware limits the ability to sense the world around them “at a meaningful distance,” the Army said.

To make it work, a sensor must register the live environment, and a computer must “see and understand” that environment, including any changes to it. This allows for realistic placement of computer-generated holographic content, according to the service.
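
To see how registration feeds placement, consider a hedged sketch: once tracking provides the headset’s pose, a hologram anchored to a world position can be projected into display coordinates with a standard pinhole camera model. The pose source and intrinsics here are illustrative assumptions, not the Army’s hardware:

```python
import numpy as np

def project_hologram(anchor_world, head_R, head_t,
                     fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Place a world-anchored hologram on the display.

    anchor_world : np.array([x, y, z]) hologram position in the sensed world
    head_R       : (3, 3) headset rotation in the world, from tracking/SLAM
    head_t       : np.array([x, y, z]) headset position in the world
    fx, fy, cx, cy : illustrative pinhole intrinsics of the display camera
    """
    # World -> headset camera frame (inverse of the headset's world pose)
    p_cam = head_R.T @ (anchor_world - head_t)
    if p_cam[2] <= 0:
        return None  # anchor is behind the viewer; nothing to draw
    # Pinhole projection to pixel coordinates
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return u, v, p_cam[2]  # pixel position plus depth for the occlusion test
```

The returned depth is what the per-pixel occlusion test above would compare against the sensed real-world depth.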

Over the last year, the Combat Capabilities Development Command’s Soldier Center identified and evaluated several commercial off-the-shelf sensors that provide dynamic occlusion capability. The center also developed a prototype networked camera technique that achieves greater occlusion accuracy at greater distances.
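
The Army has not published how its networked camera prototype works, but the intuition behind pooling viewpoints can be sketched: if each camera’s depth estimate is reprojected into a shared view, a confidence-weighted fusion keeps whichever readings the cameras captured well. This is purely an assumed illustration of the general idea:

```python
import numpy as np

def fuse_networked_depth(depth_maps, confidences):
    """Hypothetical fusion of depth maps from several networked cameras,
    already reprojected into one shared viewpoint.

    depth_maps  : list of (H, W) arrays, meters (np.inf where no reading)
    confidences : list of (H, W) arrays, per-pixel reliability weights
    """
    d = np.stack(depth_maps)                 # (num_cameras, H, W)
    w = np.stack(confidences)
    w = np.where(np.isfinite(d), w, 0.0)     # ignore pixels with no reading
    d = np.where(np.isfinite(d), d, 0.0)
    total = w.sum(axis=0)
    fused = (w * d).sum(axis=0) / np.maximum(total, 1e-9)
    return np.where(total > 0, fused, np.inf)  # inf = still no depth estimate
```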

The service anticipates that dynamic occlusion range and accuracy will improve over the next year as these techniques and algorithms mature.

Fundamental breakthroughs are still on the horizon in methods for large-area tracking and in dynamic occlusion algorithms, particularly in optimizing those algorithms for weapon tracking, the Army noted.

More advancement is also needed because the extreme simultaneous localization and mapping, or SLAM, capabilities required to dynamically occlude synthetic entities are not available on the commercial market, the Army said. SLAM is the capability to construct and update a map of an unknown environment while simultaneously keeping track of an agent’s location within it.
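
A toy version of that loop makes the definition concrete: predict the agent’s position from odometry, then correct both the position and the map whenever known landmarks are re-observed. This simplified 2D example is illustrative only; fielded SLAM systems (extended Kalman filters, particle filters, pose graphs) also track uncertainty:

```python
import numpy as np

def slam_step(pose, control, observations, landmark_map, gain=0.5):
    """One cycle of a toy 2D SLAM loop.

    pose         : np.array([x, y]) current position estimate
    control      : np.array([dx, dy]) odometry since the last step
    observations : dict {landmark_id: offset} landmark positions
                   measured relative to the agent this cycle
    landmark_map : dict {landmark_id: world position} the map built so far
    """
    pose = pose + control                      # 1. predict via dead reckoning
    for lid, offset in observations.items():
        seen_at = pose + offset                # where this sighting puts the landmark
        if lid in landmark_map:
            # 2. correct: pull the pose toward agreement with the stored map
            pose = pose + gain * (landmark_map[lid] - seen_at)
        else:
            # 3. map: record a newly discovered landmark
            landmark_map[lid] = seen_at
    return pose, landmark_map
```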

And while the Army works to bring more realism into its virtual and constructive environments, achieving more realism in live training is challenging enough. The service is trying to find alternatives to its Instrumentable-Multiple Integrated Laser Engagement System, or I-MILES, which was developed in the 1970s and 1980s for live force-on-force and force-on-target training at Army training locations around the world.

“Although I-MILES has seen enhancements over the years, laser-based systems inherently introduce artificialities into live exercises because of their limited ability to realistically represent lethal effects,” the Army statement said. “A shrub or a cardboard box, for example, provides effective cover from a laser hit but would be useless in a firefight.”

The Army also wants to more accurately depict the effects of direct and indirect fire, and to train with emerging longer-range and more sophisticated weapons that are difficult and expensive to represent in live training, according to the service’s statement.

“Our goal with Live is to better replicate the lethality, vulnerability and effects of actual live-fire engagements at all of our Army training centers,” Maj. Gen. Maria Gervais, who is in charge of developing the STE, said in the statement. “Simultaneously, the consequences of all the actions and the weapons systems in use must be accurately depicted in the virtual environment so soldiers training via simulation at other locations will have the same operational picture in real time.”

Jen Judson is an award-winning journalist covering land warfare for Defense News. She has also worked for Politico and Inside Defense. She holds a Master of Science degree in journalism from Boston University and a Bachelor of Arts degree from Kenyon College.
