An aircraft carrier’s flight deck is constantly monitored, with every launch, recovery and taxi captured on video and broadcast throughout the ship. But when unmanned aircraft enter the U.S. Navy fleet, an entirely new camera may be installed, and the sailors moving planes around the deck will be the stars.
Researchers at MIT have developed a system that allows a camera and computer to recognize the hand signals that sailors use to guide unmanned aircraft around the flight deck, a feat that could eventually enable sailors to move a UAV with little more than a wave.
It’s a step in answering one of the biggest challenges in unmanned naval aviation: how to safely control a UAV on a hectic carrier flight deck while maintaining the cycle of aircraft launches and recoveries.
“It would be really nice if we had an unmanned vehicle that can understand human gestures,” said Yale Song, a doctoral candidate at MIT who developed the system.
His work is the UAV equivalent of an aviation boatswain’s mate flashing hand signals to pilots before an aircraft launch and after a trap, or landing. In effect, Song has developed a way for a UAV to “see” the signals and identify the commands the signals represent.
“Gesturing is an instinctive skill we all have, so it requires little or no thought, leaving the focus on the task itself, as it should be, not on the interaction modality,” Song and his colleagues wrote in a paper that appears in the March issue of ACM Transactions on Interactive Intelligent Systems, an academic journal.
Song’s project works with a camera monitoring an aviation boatswain’s mate’s hand gestures. The camera instantly sends the images to a computer program he developed that can interpret the sailor’s signals. With future research, a UAV may be able to understand those signals and maneuver around the flight deck as gingerly and deliberately as its manned counterpart.
Song’s project, which began in January 2009, was funded by the Navy’s Office of Naval Research. He traveled to Naval Air Station Pensacola, Fla., and learned hand signals used on flight decks.
From there, he returned to MIT with a Naval Air Training and Operating Procedures Standardization manual and taught 20 students how to perform 24 gestures. Video footage from his lab shows a student wearing a yellow turtleneck and a cranial device, just like a shipboard aircraft handler.
All 20 students performed the 24 signals in front of his camera, which translated their hand motions and body pose into a stick figure. With that, Song developed an algorithm to “learn” how to identify signals from people it had never seen before.
“Based on that training data, we trained our model so that when new data comes in, it has our algorithm to classify the sequence of gestures,” he said.
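The pipeline Song describes, recorded pose sequences used as training data, with new sequences classified against them, can be illustrated with a minimal sketch. This is not the model from the paper; it is a generic nearest-neighbor classifier over stick-figure pose sequences using dynamic time warping, and all gesture names and pose data below are hypothetical.

```python
# Illustrative only: classify a sequence of stick-figure poses by finding
# the nearest training sequence under dynamic time warping (DTW).
# Gesture labels and coordinates are hypothetical examples.

def dtw_distance(a, b):
    """DTW distance between two pose sequences.

    Each sequence is a list of pose vectors (tuples of joint coordinates).
    """
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Euclidean distance between the two poses being aligned.
            d = sum((x - y) ** 2 for x, y in zip(a[i - 1], b[j - 1])) ** 0.5
            cost[i][j] = d + min(cost[i - 1][j],      # skip a frame in b
                                 cost[i][j - 1],      # skip a frame in a
                                 cost[i - 1][j - 1])  # align both frames
    return cost[n][m]

def classify(sequence, training_set):
    """Return the label of the nearest training example."""
    return min(training_set, key=lambda ex: dtw_distance(sequence, ex[1]))[0]

# Hypothetical training data: (label, sequence of 2-D hand positions).
training = [
    ("move_forward", [(0.0, 0.0), (0.1, 0.5), (0.2, 1.0)]),
    ("stop",         [(1.0, 1.0), (1.0, 1.0), (1.0, 1.0)]),
]

print(classify([(0.05, 0.1), (0.15, 0.6), (0.25, 0.9)], training))
```

DTW lets the classifier tolerate gestures performed faster or slower than the training examples, which is one reason sequence alignment is a common baseline for this kind of task.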
In the journal article, he wrote that the system accurately recognizes gestures 75.37 percent of the time.
More research is needed before his work can be installed on a UAV, he said.
For one, his project used a calm, well-lit laboratory setting, not a cluttered and fast-moving environment like a flight deck. As such, he must find a way to negate these variables so they don’t interfere with his system, he said.
“We started from this controlled environment. Obviously, on the deck they have a glaze effect [on the camera lens] and a lot of people moving along, so we didn’t take those sorts of factors [into account], but that’s future work,” he said.
Remote Control Approach
While Song and other researchers at MIT have developed a way to recognize hand signals to control UAVs, Northrop Grumman has developed a special remote control system for maneuvering the company’s unmanned combat air system — the X-47B — on flight decks.
Tighe Parmenter, director for business development for Northrop’s Unmanned Combat Air System Aircraft Demonstration team, said his company’s UAV will be moved by an “experienced operator” equipped with a control device that’s attached to the wrist, waist and one hand.
Controllers will have access to both a display and controls that will allow them to adjust the plane’s throttle, lower the tailhook, apply the brakes and several other functions. The commands will go from the control system to the UAV via a digital datalink.
“There are remote-controlled trucks, there are remote-controlled airplanes, but this is on steroids,” Parmenter said.
Northrop has tested the control system with X-47B mission operators, all of whom are pilots, many with naval aviation backgrounds.
“One of the biggest things we’re proud of is that every function we have on this airplane can be done by a person who is smart on naval aviation,” he said. “A mission operator could be a senior enlisted person who is perhaps an air traffic controller or an [operations specialist]. The interfaces are very simple: keyboard and mouse.”
Northrop also has developed and tested a system for autonomous landings. In July, software and hardware developed for the X-47B was installed on an F/A-18D Hornet, allowing the fighter jet to emulate the UAV’s landing program.
The modification allowed the X-47B’s program to take control of the Hornet and land the plane on the deck of the carrier USS Dwight D. Eisenhower while the plane’s pilot kept his hands off the controls. The process was aided by a series of automatic messages between the plane and the carrier. The plane’s approach was tracked with GPS, and automatic adjustments were made to ensure the Hornet was on the proper trajectory.
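The trajectory-keeping described above amounts to a feedback loop: compare the GPS-derived position with the desired glidepath and command a correction. The sketch below shows the idea with a simple proportional correction; the glideslope angle, gain and function names are hypothetical illustrations, not the actual X-47B control law.

```python
# Illustrative sketch of a glideslope feedback correction. The 3.5-degree
# angle and 0.5 gain are hypothetical values for demonstration only.
import math

GAIN = 0.5  # hypothetical proportional gain

def glideslope_altitude(distance_to_ship, glideslope_deg=3.5):
    """Desired altitude (same units as distance) on a straight glideslope."""
    return distance_to_ship * math.tan(math.radians(glideslope_deg))

def altitude_correction(distance_to_ship, current_altitude):
    """Proportional correction: positive means climb, negative means descend."""
    error = glideslope_altitude(distance_to_ship) - current_altitude
    return GAIN * error
```

An aircraft exactly on the glideslope gets a zero correction; one below it gets a climb command proportional to the error, which is the basic behavior an automatic approach system must exhibit between updates.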
Besides the new hardware and software, that flight looked just like any other manned landing. And that was the point: The X-47B is supposed to operate exactly like a manned aircraft, with no special procedures, accommodations or allowances.
There is still human interaction in a UAV landing, similar to what occurs in a manned recovery. For one, both manned and unmanned aircraft need to receive clearance from air traffic control. From there, the landing signal officer is responsible for safety. The LSO must still accept the plane via the “pickle switch,” and can wave off an approach if an unsafe condition develops.
Mary Cummings, a former Hornet and A-4 Skyhawk pilot who is a professor at MIT’s Humans and Automation Lab, said the most difficult part of putting UAVs on a carrier is coordinating sensors on the flight deck and making sure everyone has access to all the information they need.
“There’s no question it could be done; it’s a technical feat,” said Cummings, who researches unmanned vehicles.