Catching Up: Sailors aboard USS Minnesota and other attack submarines will soon train on touchscreens so sensitive they can detect pressure, not just location. This is forcing the simulation industry to develop new gestures for interacting with screens. (Mark D. Faram/staff)
ORLANDO, FLORIDA — In the US Navy, simulation and virtual reality have long been mostly an aviator’s game, but that’s changing. Today, some of the Navy’s most innovative training systems are under development for the submarine and surface communities.
The Navy will soon field its first 3-D Weapons Launch Console Tram Trainer at the Submarine Training Facility in Pearl Harbor, Hawaii. It’s designed to simulate loading and launching weapons on fast attack submarines, allowing teams to practice on everything from torpedoes to missiles, startup checklists to shutdown procedures.
Arrays of flat-panel screens can be reconfigured in minutes to offer everything from a 688-class torpedo room to a Seawolf- or Virginia-class space. Touch-sensitive displays offer 3-D views of launch controls, torpedo tubes and weapons-handling gear.
But here’s where the experience gets wild: The screens — some up to 55 inches — are not just touch-sensitive but pressure-sensitive. There’s no mouse-clicking to make things happen; sailors are expected to reach out and “grab” the objects they want to manipulate.
Trainees are meant to get the feel of twisting valves and turning screws — to the point that if you drop the screw, the simulation might take you to your hands and knees to look for the errant fastener.
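Conceptually, the "grab" interaction described above can be driven by simple pressure thresholds: light contact highlights an object, while a firmer press picks it up. The sketch below is purely illustrative; the threshold values and the `TouchSample` type are assumptions, not details of the trainer's actual software.

```python
from dataclasses import dataclass

# Hypothetical thresholds on a normalized 0.0-1.0 pressure scale;
# the article gives no actual values, so these are illustrative only.
TOUCH_THRESHOLD = 0.15   # pressure at which contact registers
GRAB_THRESHOLD = 0.60    # firmer press interpreted as "grabbing" an object

@dataclass
class TouchSample:
    x: float          # screen coordinates
    y: float
    pressure: float   # normalized 0.0 - 1.0

def classify(sample: TouchSample) -> str:
    """Map one pressure reading to an interaction state."""
    if sample.pressure >= GRAB_THRESHOLD:
        return "grab"    # sailor has 'grabbed' the valve or screw
    if sample.pressure >= TOUCH_THRESHOLD:
        return "touch"   # light contact: highlight, don't actuate
    return "hover"

print(classify(TouchSample(0.5, 0.5, 0.7)))  # grab
```

In a real engine the classifier would also need hysteresis, so a slight easing of pressure mid-gesture does not drop the screw the trainee is holding.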
“It’s about incorporating physics-based models into what was before a very flat simulation designed to familiarize you with where things were,” said Capt. Wes Naylor, executive officer of the Naval Air Warfare Center Training Systems Division. “This takes that to a whole different level by actually being able to operate with your hands the valves and switches, and doing multiperson and team events such as raising or lowering a weapons cradle, giving you more a feel for actually doing it in real life.”
In fact, this simulator is pushing the state-of-the-art in haptics, or tactile feedback, and is helping to move the simulation industry itself to a new level.
Right now, the touch interface lets sailors use both hands on the screen in an intuitive way: they can navigate around the weapons space and controls, or use a single hand to twist valves and flip switches and knobs. But developers are finding they need more gestures than are currently available.
“We’re pushing the envelope on what gestures you can recognize on a 3-D panel,” said Dave Williams, deputy director of NAWCTSD’s undersea division. “Every sailor comes into the Navy these days knowing how to operate an iPad or an Android tablet — but when you are in a 3-D environment on a touch screen and you have to operate a valve with both hands, or flip a toggle switch or turn a knob, some of the gestures just aren’t there.”
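One of the missing gestures Williams describes, turning a valve or knob with two fingers, reduces to tracking how the line between two contact points rotates from one frame to the next. The function below is a minimal sketch of that idea only; the name and coordinate convention are assumptions, not part of any Navy or UCF codebase.

```python
import math

def twist_angle(p0a, p0b, p1a, p1b):
    """Degrees the line between two fingers rotated between frames.

    p0a, p0b: (x, y) of the two fingers at the start of the frame;
    p1a, p1b: the same two fingers one frame later.
    Positive values are counterclockwise in a y-up coordinate system.
    """
    a0 = math.atan2(p0b[1] - p0a[1], p0b[0] - p0a[0])
    a1 = math.atan2(p1b[1] - p1a[1], p1b[0] - p1a[0])
    delta = math.degrees(a1 - a0)
    # Normalize to (-180, 180] so a small twist near the wrap-around
    # point isn't misread as a nearly full turn the other way.
    return (delta + 180) % 360 - 180

# One finger fixed at the pivot, the other swinging from 3 o'clock
# to 12 o'clock reads as a 90-degree counterclockwise twist.
print(twist_angle((0, 0), (1, 0), (0, 0), (0, 1)))  # 90.0
```

A recognizer would accumulate these per-frame deltas and fire a "valve turned" event once the total crosses the detent angle of the simulated hardware.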
To support the effort, the Office of Naval Research has awarded the University of Central Florida a small-business contract to develop and code additional gestures. Williams said these gestures will be made available to the training and gaming industries once the contract is complete.
These haptic interfaces will also be a cornerstone of the Immersive Virtual Ship Environment being developed to train sailors for the new littoral combat ship.
The IVSE is to be a complete virtual ship that sailors can walk through for individual and team simulations. When complete, the multiperson, game-engine-based system will offer more than 3,500 hours of courseware for LCS crews as well as the teams that come aboard with the modular packages of mission-specific equipment.
“Our mandate is to bring these sailors to a qualification level in a shore-based training environment before they report to their ship,” Naylor said. “So we are putting them in a virtual ship and running them through, in a story line, using high-end gaming technology, a scenario that is a deployment cycle of that ship and what’s expected of them in watch standing, operations and maintenance.”
This virtual LCS will have plenty of touch screens, but will also include actual panels, consoles and controls that replicate the look and feel of the real thing.
“Developing these haptic interfaces is key, so when our research tells us that a certain task needs both hands in a very realistic scenario, as in the gear they’re going to be using, having a haptic interface to replicate that task is required,” said Scott Burlingame, deputy for LCS training in NAWCTSD’s surface directorate. “But we’re only doing that when it’s necessary to satisfy the training objective. We don’t have the budget and bandwidth to do it just because it’s cool. We want to do that when it’s required to train the sailors.”
As more haptic interfaces are developed, Naylor said, they can be incorporated quickly and easily into newer versions of the simulations’ software and hardware.
“It doesn’t require, as often is the case with systems in the fleet, going back through a totally new acquisition cycle,” he said.
As Maureen Bergondy-Wilheim, director of NAWCTSD’s Research and Technology Programs Office, sees it, the need for ever more realism in training is driving the development of new haptic interfaces, which will lead to more cooperation between developers of game-based training and the entertainment gaming industry.
“There are still some fundamental problems with haptic interfaces that we do not, as yet, have answers to,” she said. “The gaming industry will probably address some of them, and others will probably be left for us to solve, and in the end, everyone benefits.” ■