Now, robots that respond to human gestures
March 12th, 2009 - 1:19 pm ICT by ANI
Washington, Mar 12 (ANI): Autonomous, do-it-all robots are no longer the domain of science fiction or cartoons like The Jetsons, thanks to scientists from Brown University who have developed a robot that responds to human gestures.
The research team demonstrated how a robot can follow nonverbal commands from a person in a variety of environments, indoors as well as outdoors, all without adjusting for lighting.
“We have created a novel system where the robot will follow you at a precise distance, where you don’t need to wear special clothing, you don’t need to be in a special environment, and you don’t need to look backward to track it,” said Chad Jenkins, assistant professor of computer science at Brown University and the team’s leader.
During the study, the researchers, including Brown graduate students, showed how the robot responded as they used a variety of hand-arm signals to instruct it, including “follow,” “halt,” “wait” and “door breach.”
The Brown team started with a PackBot, a mechanized platform developed by iRobot that has been used widely by the U.S. military for bomb disposal, among other tasks.
They outfitted their robot with a commercial depth-imaging camera. They also equipped it with a laptop running novel computer programs that enabled the machine to recognize human gestures, decipher them and respond to them.
The researchers made two key advances with their robot. The first involved what scientists call visual recognition. They created a computer program whereby the robot recognizes a human by extracting a silhouette, as if the person were a virtual cutout.
This allowed the robot to home in on the human and receive commands without being distracted by other objects in the space.
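The article does not publish the team's code, but the silhouette idea can be illustrated with a minimal sketch: given a depth frame, keep only the pixels closer than a cutoff, producing a binary "cutout" mask of the nearby person. The function name, cutoff value and toy frame below are illustrative assumptions, not details from the study.

```python
import numpy as np

def extract_silhouette(depth_frame, max_person_depth=3.0):
    """Segment a nearby person from a depth frame by keeping only
    pixels closer than a depth cutoff (metres), yielding a binary
    'cutout' mask. Illustrative only; not the Brown team's algorithm.

    depth_frame: 2-D array of per-pixel distances in metres.
    Returns a boolean mask: True where foreground is present.
    """
    valid = depth_frame > 0            # zero often marks missing range data
    return valid & (depth_frame < max_person_depth)

# Toy frame: a "person" 1.5 m away against a 5 m background wall.
frame = np.full((4, 4), 5.0)
frame[1:3, 1:3] = 1.5
print(int(extract_silhouette(frame).sum()))  # 4 foreground pixels
```

Because the mask comes from depth rather than colour, it stays stable under lighting changes, which is consistent with the article's claim that no lighting adjustment was needed.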
The second advance involved the depth-imaging camera, a CSEM Swiss Ranger, which uses infrared light to detect objects and to measure the distance between the camera and the target, as well as between the camera and any other objects in the area.
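The article doesn't detail the camera's internals; as background, continuous-wave time-of-flight cameras such as the SwissRanger typically recover range from the phase shift of an amplitude-modulated infrared signal, via d = c·φ / (4π·f_mod). The sketch below shows that conversion; the modulation frequency and phase value are illustrative numbers, not specifications from the study.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_distance(phase_rad, mod_freq_hz):
    """Convert the measured phase shift of a modulated infrared
    signal into range: d = c * phase / (4 * pi * f_mod).
    The light travels to the object and back, so the round trip
    contributes the extra factor of two inside the 4*pi.
    """
    return C * phase_rad / (4 * math.pi * mod_freq_hz)

# Illustrative numbers: 20 MHz modulation, quarter-cycle phase shift.
d = phase_to_distance(math.pi / 2, 20e6)
print(round(d, 3))  # ~1.874 m
```

Note that the unambiguous range of such a camera is bounded by one full modulation cycle (c / (2·f_mod), about 7.5 m at 20 MHz), which is why these sensors suit close-range person tracking.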
The result is a robot that doesn’t require remote control or constant vigilance, Jenkins said.
“What you really want is a robot that can act like a partner,” Jenkins added. “You don’t want to puppeteer the robot. You just want to supervise it, where you say, ‘Here’s your job. Now, go do it.’”
The study was presented at the Human-Robot Interaction conference in San Diego. (ANI)