What better way to combine your nerdy loves of computer programming and Star Wars than with a robot that can actually battle with a light saber?
This is “JediBot,” a Microsoft Kinect-controlled robot that can wield a foam sword (light saber, if you will) and duel a human combatant for command of the empire. Or something like that.
“We’ve all seen the Star Wars movies; they’re a lot of fun, and the sword fights are one of the most entertaining parts of it. So it seemed like it’d be cool to actually sword fight like that against a computerized opponent, like a Star Wars video game,” graduate student Ken Oslund says in the video above.
The world of dynamic robotics and AI has been immensely aided by the affordable, hackable Microsoft Kinect. The Kinect pairs a color camera with an infrared depth sensor, which makes recognizing, analyzing, and interacting with a three-dimensional moving object (namely, a human) much simpler than in the past. Microsoft recently released an SDK for the Kinect, so we should be seeing increasingly useful and creative applications of the device. The KUKA robotic arm in the video above is traditionally used in assembly-line manufacturing, but you may remember it from a Microsoft Halo: Reach light sculpture video last year.
According to the course overview (.PDF) for the “Experimental Robotics” course, the purpose of the laboratory-based class is “to provide hands-on experience with robotic manipulation.” While the other groups in the class used a PUMA 560 industrial manipulator, the JediBot team, four graduate students including Tim Jenkins and Ken Oslund, got to work with a more recently developed KUKA robotic arm. They chose the final project themselves and completed it in a mere three weeks.
“The class is really open-ended,” Jenkins said. “The professor likes to have dynamic projects that involve action.”
The group knew they wanted to use computer vision so a person could interact with their robot. Given the hardware available, they chose a Microsoft Kinect over a conventional camera, using it to track the position of the opponent’s green foam saber.
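The article doesn’t detail the vision pipeline, but tracking a brightly colored prop is a classic color-segmentation problem. Here is a minimal, hypothetical sketch using OpenCV; the HSV bounds, the blob-size cutoff, and the function names are all illustrative assumptions, not the team’s actual code.

```cpp
#include <cstdint>
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

struct SwordFix {
    bool   found;
    double u, v;      // pixel centroid of the green blob
    double depth_mm;  // Kinect depth reading at the centroid
};

// Locate the largest green blob in a BGR frame and read its depth,
// assuming a registered 16-bit depth map in millimeters.
SwordFix locateSword(const cv::Mat& bgr, const cv::Mat& depth16) {
    cv::Mat hsv, mask;
    cv::cvtColor(bgr, hsv, cv::COLOR_BGR2HSV);

    // Keep only strongly saturated green pixels (bounds are illustrative).
    cv::inRange(hsv, cv::Scalar(45, 100, 60), cv::Scalar(85, 255, 255), mask);

    cv::Moments m = cv::moments(mask, /*binaryImage=*/true);
    if (m.m00 < 500.0)                      // too few pixels: no saber in view
        return {false, 0.0, 0.0, 0.0};

    double u = m.m10 / m.m00;               // blob centroid, x
    double v = m.m01 / m.m00;               // blob centroid, y
    double z = depth16.at<std::uint16_t>(static_cast<int>(v),
                                         static_cast<int>(u));
    return {true, u, v, z};
}
```

The centroid plus the Kinect’s depth reading gives a 3-D point for the saber that can be fed to the arm’s controller every frame.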
The robot strikes using a set of predefined attack motions. When it detects a hit (contact between the two foam sabers registers as torque on the robotic arm’s joints), it recoils and moves on to the next motion, switching from move to move every one or two seconds.
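That attack cycle is essentially a small state machine. A hedged sketch of what it could look like follows; the helper functions, joint count, and torque threshold are hypothetical stand-ins for whatever interface the team actually used, and only the hit-recoil-advance logic and the roughly two-second cadence come from the description above.

```cpp
#include <array>
#include <chrono>
#include <cmath>

constexpr int    kNumJoints       = 7;    // assumed 7-DOF KUKA lightweight arm
constexpr double kTorqueThreshold = 5.0;  // Nm; illustrative value

std::array<double, kNumJoints> readJointTorques();  // stub: query the arm
void commandMove(int moveId);                       // stub: start attack #moveId
void commandRecoil();                               // stub: pull the saber back

// A parry shows up as excess torque on at least one joint.
bool swordContact() {
    for (double tau : readJointTorques())
        if (std::abs(tau) > kTorqueThreshold) return true;
    return false;
}

void attackLoop(int numMoves) {
    using clock = std::chrono::steady_clock;
    int  move    = 0;
    auto started = clock::now();
    commandMove(move);
    for (;;) {
        bool timedOut = clock::now() - started > std::chrono::seconds(2);
        if (swordContact() || timedOut) {
            if (!timedOut) commandRecoil();  // hit: recoil before re-attacking
            move = (move + 1) % numMoves;    // then advance to the next attack
            commandMove(move);
            started = clock::now();
        }
    }
}
```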
“The defense mechanics were the most challenging, but people ended up enjoying the attack mode most. It was actually kind of a gimmick and only took a few hours to code up,” Jenkins said.
The project utilized a secret weapon not apparent in the video: a set of C/C++ libraries developed by Stanford visiting entrepreneur and researcher Torsten Kroeger. Normally, a robot must plot out the entire trajectory of a motion from start to finish; the motion is pre-planned. Kroeger’s Reflexxes Motion Libraries let the robot react to events (like collisions or new data from the Kinect) by simply updating the target position and velocity; the libraries compute a new trajectory on the fly, in less than a millisecond.
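For a sense of how that works, here is a minimal control-loop sketch in the style of the published Reflexxes Type II API; the single joint, 1 kHz cycle, and kinematic limits are illustrative assumptions, not JediBot’s actual configuration.

```cpp
// One joint tracked through one Reflexxes motion. In JediBot's case the
// target would be overwritten whenever the Kinect reports a new saber
// position, and the library replans from the current state each cycle.
#include <ReflexxesAPI.h>
#include <RMLPositionFlags.h>
#include <RMLPositionInputParameters.h>
#include <RMLPositionOutputParameters.h>

int main() {
    const int    dofs      = 1;      // single joint for brevity
    const double cycleTime = 0.001;  // assumed 1 kHz control loop

    ReflexxesAPI                rml(dofs, cycleTime);
    RMLPositionInputParameters  ip(dofs);
    RMLPositionOutputParameters op(dofs);
    RMLPositionFlags            flags;

    ip.CurrentPositionVector->VecData[0] = 0.0;  // rad
    ip.CurrentVelocityVector->VecData[0] = 0.0;  // rad/s
    ip.MaxVelocityVector->VecData[0]     = 1.0;  // rad/s (illustrative)
    ip.MaxAccelerationVector->VecData[0] = 2.0;  // rad/s^2 (illustrative)
    ip.SelectionVector->VecData[0]       = true;

    ip.TargetPositionVector->VecData[0]  = 1.5;  // where the saber should go
    ip.TargetVelocityVector->VecData[0]  = 0.0;

    int result = ReflexxesAPI::RML_WORKING;
    while (result != ReflexxesAPI::RML_FINAL_STATE_REACHED) {
        result = rml.RMLPosition(ip, &op, flags);
        // Feed the computed state back in; on a real arm these setpoints
        // would also be streamed to the joint controllers every cycle.
        *ip.CurrentPositionVector = *op.NewPositionVector;
        *ip.CurrentVelocityVector = *op.NewVelocityVector;
    }
    return 0;
}
```

Because the target vectors can be overwritten on any cycle, a new Kinect measurement simply becomes a new target, and the trajectory bends toward it mid-motion.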
This allows JediBot to respond to sensor events in real time, and that responsiveness is the key to making robots more interactive.
Imagine a waiter-bot with the reflexes to catch a falling drink before it hits the ground, or a karate robot you can spar against for practice before a big tournament.
Few people will be buying their own KUKA robotic arm and building a sword-fighting robot like JediBot at home, but innovations like this, interactive controllers paired with libraries like Reflexxes for real-time physical responses, could bring us robots that interact with us far better in daily life.
Video courtesy: Stanford University/Steve Fyffe
"
No comments:
Post a Comment