Hand Signals: the next step toward controlling UAVs on aircraft carriers

The flight deck of an aircraft carrier is an environment that is constantly monitored. The organized chaos of launches, recoveries and taxiing takes place in totally unforgiving surroundings for an unmanned aircraft (and for manned planes too…).

According to an interesting article published by Navy Times, researchers at the Massachusetts Institute of Technology (MIT) took a very close look at the problem of moving UAVs (Unmanned Aerial Vehicles) about the deck without endangering crew or interfering with normal operations. They came up with an ingenious camera-and-computer system that recognizes the hand signals sailors use to guide aircraft about an aircraft carrier’s deck.

It may be the step that finally makes UAV use on an aircraft carrier possible. “It would be really nice if we had an unmanned vehicle that can understand human gestures,” said Yale Song, a Ph.D. candidate at MIT who developed the system.

“Gesturing is an instinctive skill we all have, so it requires little or no thought, leaving the focus on the task itself, as it should be, not on the interaction modality,” said Song.

Song’s project, which began in January 2009 and was funded by the Office of Naval Research, took him to Naval Air Station Pensacola, Florida, where he learned the hand signals used by the sailors on the flight deck. He then used those to “train” 20 students in 24 of the signals. The students wore a yellow turtleneck and a cranial to replicate the clothing used onboard carriers.

The students performed all of the signals whilst being filmed by Song’s camera/computer combination, which in turn translated their hand movements into stick figures. With this data, Song was able to develop an algorithm that learns to identify and recognize the signals even from people it hadn’t met before, and whose slight individual variations it therefore hadn’t learned.
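For illustration only, here is a minimal sketch of what such a “stick figure” representation could look like in code. Everything in it is an assumption rather than a description of Song’s actual system: the joint list and the idea of centering on the shoulders and scaling by shoulder width are simply one common way to make pose data person-invariant, so that two differently sized people making the same signal produce similar feature vectors.

```python
import numpy as np

# Hypothetical upper-body joints a pose estimator might report per frame.
JOINTS = ["head", "l_shoulder", "r_shoulder",
          "l_elbow", "r_elbow", "l_wrist", "r_wrist"]

def normalize_pose(joints_xy):
    """Turn raw (x, y) joint positions into a person-invariant
    'stick figure': centered between the shoulders and scaled by
    shoulder width."""
    pts = np.asarray(joints_xy, dtype=float)        # shape (7, 2)
    l_sh = pts[JOINTS.index("l_shoulder")]
    r_sh = pts[JOINTS.index("r_shoulder")]
    center = (l_sh + r_sh) / 2.0                    # torso reference point
    scale = np.linalg.norm(l_sh - r_sh) or 1.0      # shoulder width
    return ((pts - center) / scale).ravel()         # flat 14-d vector
```

A filmed clip then becomes a sequence of these vectors, one per frame, which is the kind of input a gesture classifier would consume.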

Song said: “Based on that training data, we trained our model so that when new data comes in, it uses our algorithm to classify the sequence of gestures.”
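As a hedged sketch of that train-then-classify loop: the snippet below summarizes each filmed clip into a fixed-length vector and fits an off-the-shelf classifier. This is not Song’s actual model; the placeholder data, the helper names and the choice of a random-forest classifier are all assumptions made purely for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def clip_features(frames):
    """Summarize a variable-length gesture clip (a list of per-frame
    pose vectors) as one fixed-length vector: per-dimension mean + std."""
    f = np.asarray(frames, dtype=float)
    return np.concatenate([f.mean(axis=0), f.std(axis=0)])

# Placeholder data standing in for the filmed student recordings:
# 48 clips of ~30 frames, each frame a 14-d normalized pose vector.
rng = np.random.default_rng(0)
clips = [rng.normal(size=(30, 14)) for _ in range(48)]
labels = [i % 24 for i in range(48)]          # one of the 24 signals

X = np.stack([clip_features(c) for c in clips])
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, labels)

# "When new data comes in": classify an unseen clip.
new_clip = rng.normal(size=(25, 14))
predicted = model.predict(clip_features(new_clip).reshape(1, -1))[0]
```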

Song admitted that his system gets the gestures correct around 75 percent of the time, so obviously a lot more research is needed before it could be introduced on an unmanned air system.
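A figure like that is only meaningful if it is measured on people the system never trained on. One plausible, purely hypothetical way to measure it is leave-one-subject-out evaluation: train on 19 of the students, test on the 20th, and average over all 20 splits.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def loso_accuracy(features, labels, subjects):
    """features: per-clip feature vectors (n_clips, d);
    labels: signal ids; subjects: which student performed each clip."""
    features, labels, subjects = map(np.asarray, (features, labels, subjects))
    scores = []
    for s in np.unique(subjects):
        train, test = subjects != s, subjects == s
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(features[train], labels[train])
        scores.append(clf.score(features[test], labels[test]))
    return float(np.mean(scores))
```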

According to the Navy Times article, while Song and MIT look into recognizing hand signals, Northrop Grumman has developed a special remote control for moving the X-47B on flight decks: a device which attaches to the wrist, waist and one hand. The “yellow shirt” operating the device will have access to a display and will be able to control the aircraft’s throttle, tailhook and brakes, and perform several other functions associated with maneuvering an aircraft on deck.

Image credit: U.S. Naval Air Systems Command

Anyway, automation of drone operations has already reached aircraft carriers, at least for testing purposes.

An automated landing system has already been tested: it allowed the X-47B’s controllers to take control of an F/A-18, fly the approach and land the plane on the flight deck of USS Dwight D. Eisenhower whilst the Hornet’s crew made no input into the plane’s flight. Seen from the outside, the landing looks totally normal. The LSOs (Landing Signal Officers) still have the power to wave off the approach should they feel that the landing is unsafe or does not meet any other criteria required for a trap.

Richard Clements for TheAviationist.com