How robot planes could learn carrier crew hand gestures
MIT researchers are trying to get computers to correctly interpret hand signals used by crews aboard aircraft carriers so that robot planes can follow them.
As Northrop Grumman continues to develop its X-47B robot stealth plane, which is aimed at carrier use, Yale Song and colleagues at MIT are working on a machine learning system that could allow autonomous planes to understand crew directions.
In research published in the journal ACM Transactions on Interactive Intelligent Systems, the team used a database of abstract representations of 24 gestures often employed by carrier personnel. They trained an algorithm to classify gestures based on body posture and hand position, drawing on what it learned from the database.
As seen in the video below, the algorithm works with a single stereo camera. It analyzes each frame in a sequence and calculates the probability that the movement belongs to a particular gesture, while tracking probabilities across the whole sequence so that gestures can be recognized continuously.
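The per-frame, running-probability idea can be sketched as follows. This is a toy illustration only, not the MIT team's system: the gesture names, the scalar "hand height" feature, and the likelihood model are all invented placeholders standing in for real pose features and trained gesture models.

```python
import math

# Hypothetical gesture vocabulary (placeholder names, not from the paper).
GESTURES = ["wave_on", "stop", "turn_left"]

def frame_likelihood(frame, gesture):
    """Toy per-frame likelihood: each gesture 'expects' a characteristic
    hand height, and frames near that height score higher."""
    expected = {"wave_on": 0.8, "stop": 0.5, "turn_left": 0.2}[gesture]
    return math.exp(-10 * (frame - expected) ** 2)

def classify_sequence(frames):
    """Accumulate log-probabilities frame by frame, keeping a running
    score per gesture over the whole sequence, then pick the best one."""
    scores = {g: 0.0 for g in GESTURES}
    for frame in frames:
        for g in GESTURES:
            scores[g] += math.log(frame_likelihood(frame, g) + 1e-12)
    return max(scores, key=scores.get)

# Frames here are just scalar hand-height features for illustration.
print(classify_sequence([0.78, 0.82, 0.85, 0.79]))
```

A real system would replace the scalar feature with pose estimates from the stereo camera and the toy likelihoods with trained statistical gesture models, but the structure — update per-gesture probabilities every frame while the stream runs — is the same.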
When tested, the system was able to correctly identify gestures in the training database …