Augmented reality is coming soon. We’re already salivating over Google’s Project Glass eyewear, which allows directions, facts and markers to materialize before our eyes as we gaze around urban landscapes. It feels like becoming the hero of a sci-fi movie.
To the military, augmented reality holds tantalizing promise for operations and training alike. Rather than creating virtual environments, which place users in completely fake (though often realistic) environments, augmented systems overlay information on the actual scene before users’ eyes. In theater, AR goggles might add information about targets or the surrounding area; in training, they would allow all manner of threatening or nonthreatening elements to be inserted into a real landscape.
“AR has the potential to be more of a game-changing application than even virtual task training in performance support, because there is more of a real-time, dynamic, on-the-fly solution, where you’re getting the latest information as you need it,” said Josie Sutcliffe, the vice president of marketing for 3-D model and maintenance training company Ngrain.
The challenge for the many companies working on augmented devices, besides miniaturizing the technology, is deciding what content to include. Incorporating useful information cleanly into a visual system isn’t easy when you are trying to cover all aspects of daily human life. AR designed for the specific goal of training, on the other hand, is within closer reach.
First Step: Maintenance Training
Various companies, in both the training and simulation industry and the wider commercial sphere, are looking into augmented reality for maintenance training. It’s a logical fit.
“One thing we have definitely seen increasing over the past few years is the convergence of the training and performance support world, where you have training technologies being taken from a training environment and applied on the job, and vice versa,” Sutcliffe said.
The military has many systems to maintain — weapons, vehicles and electronic equipment — but they are all composed of well-defined parts. There are only so many pieces in an engine, and developers can teach computers to recognize the shapes and how they fit together. Many of them are already modeled.
At I/ITSEC, Ngrain plans to debut a new augmented reality technology demo, largely developed over the past year. By viewing a live piece of equipment (a pump used in marine applications by the military) through an iPad’s camera, users can tap into maintenance information on the parts. In addition to providing pop-up information, the program will also have 2-D and 3-D graphical overlays that highlight certain components. There are also video and text augmentations. Finally, step-by-step instructions will augment a maintenance procedure.
Such a system could be adapted for students learning the procedure for the first time or serve as a guide or refresher for those out on the job. Rather than having to go back and forth between the computer or manual and the equipment, trainees could combine the two into one seamless experience.
Moreover, Ngrain’s technology uses volume graphics based on “voxels”: pixels with X, Y and Z coordinates instead of just X and Y. The voxels have volume and a surface — think of them as very small cubes — and this allows them to display changing attributes in real time. Thus, they can show signals, temperature information or hot spots that an Army mechanic needs to avoid, or recognize and represent real equipment and parts in 3-D.
Such voxel data could be collected live or, for training purposes, be built into the software. Needing fewer sensors would make it easier to train on the fly.
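The voxel idea can be sketched in a few lines: each voxel is a small cube addressed by integer (X, Y, Z) coordinates and carrying attributes, such as temperature, that can change in real time. This is a minimal illustration, not Ngrain's actual data format; all names and values here are hypothetical.

```python
class VoxelGrid:
    """Toy voxel store: (x, y, z) coordinates -> per-voxel attributes."""

    def __init__(self):
        self.voxels = {}  # maps (x, y, z) tuples to attribute dicts

    def set(self, x, y, z, **attrs):
        # Create the voxel if needed, then update its attributes in place,
        # as a live sensor feed or simulation might.
        self.voxels.setdefault((x, y, z), {}).update(attrs)

    def hot_spots(self, threshold):
        """Coordinates of voxels whose temperature exceeds the threshold."""
        return [pos for pos, attrs in self.voxels.items()
                if attrs.get("temperature", 0) > threshold]


grid = VoxelGrid()
grid.set(1, 2, 3, part="manifold", temperature=40)
grid.set(1, 2, 4, part="exhaust", temperature=95)
print(grid.hot_spots(threshold=80))  # -> [(1, 2, 4)]
```

A display layer could then highlight the returned coordinates in red, which is roughly how a per-voxel attribute becomes a visible hot spot a mechanic knows to avoid.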
“AR integrates the schoolhouse with the field,” said Carl Byers from Logres Inc., which worked with Ngrain on the demo. “It really reinforces and integrates the concept of just-in-time training and on-the-job training, together with whatever has been learned in the schoolhouse.”
Of course, iPads don’t exactly leave your hands free for maximum maneuverability. But rather than developing a new format to experience augmented reality, Ngrain is using an easily obtainable, commercial off-the-shelf product. Part of making augmented reality work is adapting available technologies and putting them to work in a different way for training.
Other companies also are trying to pin down ways to do maintenance training with augmented reality. Pat Allen, the business development lead for Lockheed Martin’s maritime systems programs, noted that the systems being used at sea are growing increasingly complex and need to be revisited regularly by maintainers. The company’s goal is to provide tablet-based tools that “ease the burden” of remembering rarer procedures that still need to be performed correctly.
“We’re also looking into introducing an augmented reality feature that is going to have an operator or a maintainer use head-mounted equipment to overlay 3-D visualization while looking directly at whatever piece of equipment they are engaged with,” Allen said, noting that the project was still in the research and development phase and had not yet been developed into a product. “I would say in a year or so we would be able to provide something.”
Developers hoping to overlay content for those learning about or practicing maintenance procedures have several options, though two are particularly common: either the camera can recognize shapes and associate them with a particular piece of equipment, or the component can be tagged with a barcode or some other marking that links to a database.
“When you’re able to take whatever device you’re using and associate it with that specific equipment, either through a barcode or an RFID [tag], then you can go into the cloud and retrieve everything about that specific piece of equipment,” Allen said. Many parts and pieces already come with barcodes, meaning the system doesn’t need an overhaul to be put to practical use.
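The marker-based lookup Allen describes amounts to using the scanned tag as a database key. The sketch below stands in for that flow with a local dictionary in place of a cloud service; the tag format, field names, and records are all illustrative, not drawn from any real Lockheed Martin or Navy system.

```python
# Hypothetical equipment database, standing in for a cloud service.
# Keys are scanned barcode/RFID identifiers; values are maintenance records.
EQUIPMENT_DB = {
    "TAG-4320-0001": {
        "name": "seawater pump",
        "manual": "pump-maintenance-manual.pdf",
        "procedures": ["inspect impeller", "replace seal kit"],
    },
}


def lookup(tag):
    """Resolve a scanned tag to everything known about that equipment."""
    return EQUIPMENT_DB.get(tag)  # None if the tag is unrecognized


record = lookup("TAG-4320-0001")
print(record["name"])             # -> seawater pump
print(record["procedures"][0])    # -> inspect impeller
print(lookup("TAG-UNKNOWN"))      # -> None
```

The AR layer's job is then presentation: once the tag resolves to a record, the retrieved manuals, overlays, and step-by-step procedures can be drawn over the live camera view of that specific unit.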
Byers noted that Ngrain and other companies were likely to move away from marker-based systems and toward markerless identification. If such augmented, step-by-step training emerges, maintenance could become more of a catch-all and less of a specialty.
“Right now the Navy trains people using Navy enlisted classification codes that say they are experts on some particular piece of equipment, whether it’s a radar or an engine,” Allen said. “With this kind of information available and the ability to intuitively guide you along, you can start asking questions of what is the right level of training.”
Adapting phones or tablet devices is an easy way to move into augmented reality training, but other companies are looking at new headgear. Google’s Glass project is perhaps the best known, but Motorola is also planning to ship a commercial headpiece, the HC1, in the first half of 2013.
The HC1 headset computer consists of a small eyepiece with the same resolution as a 15-inch laptop screen, a set of microphones, sensors that pick up head movement and an earpiece. Controlled by motion and speech, the headset is designed to work in harsh environments and can be used for maintenance, repair, and training and simulation applications.
Someone conducting engine or weapons repair, for example, could see videos, images or step-by-step instructions relating to the equipment in his peripheral eyepiece, which sits slightly below the eye and out of direct view. There are still some sensitivity issues to work out with the headset, which sometimes picks up outside noise or normal conversation and executes commands the user might not desire.
An optional camera seems to be the linchpin for remote augmented training with the device, according to Motorola’s innovation manager, Nicole Tricoukes. This camera picks up what the user is seeing and, via the headset’s Wi-Fi or Bluetooth connections, transmits it to a subject matter expert in the room or around the world. The expert can annotate and send back the images or transmit videos or instructions for display in the eyepiece.
One can imagine the technology evolving to allow training at home or refresher training abroad, connecting to experts who advise and teach in real time based on what the user is seeing.
More Than Maintenance
While companies are largely focused on AR for maintenance, other researchers are experimenting with using it for live-fire exercises.
The goal of the Augmented Immersive Team Training equipment being developed by the University of Central Florida with funding from the Office of Naval Research is to allow instructors to overlay aircraft, tanks, buildings, insurgents or weapons effects on a real landscape. This would allow, say, joint terminal attack controllers to head out to a training range to practice calls for fire, where they would confront all manner of aircraft, hostile and nonhostile forces, and explosions — all at a fraction of the cost of reality.
To produce the most realistic effects, the AITT tracks the user’s head and body movements with a wide array of sensors: cameras, GPS receivers, air pressure sensors, inertial measurement units and magnetometers. The data flowing from the sensors tell the central computer exactly where the wearer is looking, allowing precise positioning of augmented elements in the head-mounted display.
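The positioning problem that tracking solves can be illustrated with a toy calculation: given the wearer's head orientation and a synthetic target's bearing, project the target into the display's horizontal field of view. This is an illustrative sketch only; the AITT's actual sensor-fusion and rendering pipeline is far more involved, and all numbers here are assumptions.

```python
def screen_x(target_bearing_deg, head_yaw_deg, fov_deg=40, width_px=1280):
    """Horizontal pixel position of an augmented element in a head-mounted
    display, given the target's compass bearing and the tracked head yaw.
    Returns None when the target falls outside the field of view."""
    # Signed angular offset between where the head points and the target,
    # wrapped into the range (-180, 180] degrees.
    offset = (target_bearing_deg - head_yaw_deg + 180) % 360 - 180
    if abs(offset) > fov_deg / 2:
        return None  # target not currently visible in the eyepiece
    # Map [-fov/2, +fov/2] linearly onto [0, width_px].
    return round(width_px / 2 + offset / (fov_deg / 2) * (width_px / 2))


# Head facing due east (90°), simulated aircraft at bearing 95°:
print(screen_x(target_bearing_deg=95, head_yaw_deg=90))  # -> 800
# Same aircraft with the head turned north (0°): out of view.
print(screen_x(target_bearing_deg=95, head_yaw_deg=0))   # -> None
```

As the sensors report a new head yaw each frame, the overlay's pixel position is recomputed, which is why precise, low-latency tracking is what keeps an augmented tank appearing glued to the real terrain.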
Still in the prototype stage, the system was tested over the summer at the Princeton, N.J., site of SRI International.
If it becomes easier or cheaper to overlay weapons effects or aircraft on the actual landscape versus a virtual one, it will help shift training away from immersive domes or systems and out into the real world.