ROME — Military driving simulators are as effective as live training for basic tracked-vehicle maneuver skills, according to a study publicly presented for the first time today at ITEC.
A two-year joint project between industry and the UK’s Defence Science and Technology Laboratory culminated this year with data showing that students trained in actual vehicles and those trained in driving simulators achieved the same levels of success across every parameter measured: contact with poles or cones on a course, completion time, depth perception tests and ratings from an independent assessor.
Researchers condensed a four-week training course into one week and, after basic safety-critical training on a static vehicle, divided the new drivers into two groups. Both groups were coached by the same instructors and learned simple skills, such as judging the depth of the tank (modeled on the Warrior), controlled braking and a forward maneuver.
Jessica Allen, a research psychologist from DSTL, said the researchers were surprised to find that students trained in the simulator performed as well as their live-trained counterparts. The research provides scientific support for moving simple driving training to simulation, a question on which the researchers could find no other objective studies. This is crucial both because the UK Ministry of Defence is considering adding more simulation to its portfolio and because budget constraints make simulators a cheaper option than live training.
“I think that we are all aware of the financial pressure we’re under,” Allen said.
In addition to measuring trainee success, researchers found that the driving instructors, initially skeptical of simulation, grew to accept and even embrace it as a teaching method.
The group was initially closed-minded, and instructors didn’t expect to like simulation as much as they did, according to Simon Skinner, the managing director of XPI Simulation and leader of the industry side of the project. He and Allen noted that getting more instructors to teach in simulators could be key to changing a culture still somewhat resistant to the idea of simulation.
Rather than being cramped in a tank in the miserable January weather in which the test took place, instructors teaching in the simulator got to enjoy a warm environment. They were also equipped with wireless Xbox gaming console controllers and their own screen, so they could zoom out and see the tank from the outside. In addition, cameras in the simulator and on the student provided extra views of the trainee’s feet and gaze, as well as ample footage for after-action review.
The simulator used in the exercise was made specifically for the test and was a collaboration among XPI, QinetiQ, CAE and NSC. Built from the ground up with mostly commercial off-the-shelf components, it featured a motion platform, vibration, and accurate pedals and steering, and it corresponded as closely as possible to the Warrior. The group also took lidar scans of Bovington Training Range, where the live group was practicing, to create as immersive and accurate a picture as possible.
The simulator doesn’t do everything a production version would need to do: it omits certain procedural steps (such as starting the vehicle), faults and failures, and onboard command-and-control systems. For basic training, however, it answered the question of whether empirical data supports the idea that simulators are just as good at training as an actual vehicle.
Students called the simulator novel, fun and entertaining, researchers said. Rather than sitting in the back of the vehicle during live training and being unable to hear the instructor, simulation students would gather around an extra monitor to watch their fellow students. And while those who had the live training showed little desire to try simulation, those in the simulation group said they would enjoy more simulation training in the future.
Finally, an independent assessor was asked to guess whether each student in the end test (conducted in a live vehicle) had been trained in a simulator or not. He was right as often as he was wrong, showing there was little difference in the performance of the drivers.