Carrier Landing An aircraft director signals to a pilot after a safe landing aboard the aircraft carrier USS Carl Vinson. US Navy via Wikimedia
Landing airplanes on moving ships is no mean feat, and it will be even harder when the airplanes are unmanned. Along with making their own decisions, autonomous airplanes will have to heed their human counterparts during aircraft carrier takeoff and landing - but can a robot read and understand arm-waving signals?
The problem is complicated in at least two ways - first, the airplane must determine whether the human's hands are up or down, elbows in or out. Second, it has to read which gesture the human is making, and what it means. MIT PhD student Yale Song is trying to solve these problems.
Song and fellow researchers recorded various people performing the set of 24 gestures that aircraft carrier deck personnel use, including arm waves, arm folds and hand movements. They built software that determined each person's elbow, wrist and hand positions, including whether palms were open and whether thumbs were up or down; that portion of the research was completed last year. Then the team had to classify all these gestures according to their meanings. That's complicated, because deck signals are a complex ballet of movement - it's not as if a seaman makes one motion and then stops for a beat before starting another. So the algorithm has to determine a gesture's meaning without a clear beginning, middle and end.
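For readers curious what those per-frame measurements might look like in code, here is a minimal sketch of a pose record that a classifier could consume. This is not the MIT team's software; the class and field names are illustrative assumptions.

```python
# A hypothetical per-frame pose record for gesture classification.
# Names and layout are assumptions for illustration, not from the paper.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class PoseFrame:
    """Body-pose features estimated from one video frame."""
    left_elbow: Tuple[float, float]    # (x, y) image coordinates
    right_elbow: Tuple[float, float]
    left_wrist: Tuple[float, float]
    right_wrist: Tuple[float, float]
    left_palm_open: bool               # open palm vs. closed fist
    right_palm_open: bool
    left_thumb_up: bool                # thumb up vs. thumb down
    right_thumb_up: bool

    def as_vector(self) -> List[float]:
        """Flatten the pose into a plain feature vector for a classifier."""
        coords = [*self.left_elbow, *self.right_elbow,
                  *self.left_wrist, *self.right_wrist]
        flags = [float(self.left_palm_open), float(self.right_palm_open),
                 float(self.left_thumb_up), float(self.right_thumb_up)]
        return coords + flags
```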
The team accomplished this by breaking the videos into overlapping clusters of frames. As MIT News puts it: "The second sequence might start at, say, frame 10 of the first sequence, the third sequence at frame 10 of the second, and so on." This way, the algorithm can calculate the probability that a given sequence belongs to one of the 24 gestures in the catalog. In the researchers' tests, the algorithm correctly identified the hand-signal sequences 76 percent of the time.
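To make that overlapping-window idea concrete, here is a rough sketch of the general pattern: staggered frame sequences are each scored against every gesture in the catalog, and the most probable gesture wins. The scoring function is a placeholder assumption; the actual system uses a trained sequence classifier, and the window length and stride here are invented for illustration.

```python
# Sketch of scoring overlapping frame sequences against a gesture catalog.
# The `score` callable stands in for a trained model; values here are assumptions.
from typing import Callable, List, Sequence

NUM_GESTURES = 24  # size of the deck-signal catalog described in the article


def classify_stream(
    frames: Sequence[List[float]],                    # per-frame feature vectors
    score: Callable[[Sequence[List[float]], int], float],  # likelihood of a window under gesture g
    window_len: int = 60,                             # illustrative window length
    stride: int = 10,                                 # each window starts 10 frames after the last
) -> List[dict]:
    """Return the most probable gesture for each overlapping window of frames."""
    results = []
    last_start = max(1, len(frames) - window_len + 1)
    for start in range(0, last_start, stride):
        window = frames[start:start + window_len]
        probs = [score(window, g) for g in range(NUM_GESTURES)]
        total = sum(probs) or 1.0
        probs = [p / total for p in probs]            # normalise into a distribution
        best = max(range(NUM_GESTURES), key=lambda g: probs[g])
        results.append({"start_frame": start,
                        "gesture": best,
                        "confidence": probs[best]})
    return results
```

Because successive windows share most of their frames, the classifier never needs to be told where one gesture ends and the next begins; it simply reports the best match for each staggered slice of the stream.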
Song believes this can be improved by tweaking the computations, according to MIT. The work appears in the journal ACM Transactions on Interactive Intelligent Systems.
[MIT News]