El-E isn’t designed to behave like a human; rather, its purpose is to interact with us. It simply grabs stuff you point at with a laser.
“The entire world becomes a point-and-click interface. Objects become buttons. And if you point at one, the robot comes to grab it,” said Charlie Kemp, the director of Georgia Tech’s Center for Healthcare Robotics and the robot’s designer. “It creates a click-able world.”
El-E, which was unveiled Wednesday at an Amsterdam conference, will be tested this summer in a real-world setting involving patients with a degenerative disease.
El-E is built primarily from off-the-shelf components. Its mobile base is an ERRATIC platform from Videre Design, which includes an onboard computer with a Core Duo processor and 1 GB of memory; the robot performs all of its computation on board. The computer runs Ubuntu GNU/Linux, and the engineers wrote most of the software in Python, with occasional C++. They also use a variety of open source packages, including SciPy, Player/Stage and OpenCV.
When commanding the robot, the user points a laser at something for a few seconds. The robot responds with a beep and then zeros in on the target. Once there, it lifts a mechanical arm and grabs the object.
It begins the return trip when the laser is pointed at the user’s feet, and it looks for a human face before handing over what it grabbed.
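The team hasn’t published El-E’s source, but the core of the laser-designation step is a classic bright-spot detection problem. A minimal NumPy sketch is below, assuming an RGB camera frame and a red laser pointer; `find_laser_spot` and its brightness threshold are illustrative stand-ins, not the team’s actual code:

```python
import numpy as np

def find_laser_spot(frame, min_brightness=200):
    """Locate the brightest red-dominated pixel in an RGB frame (H x W x 3).

    Returns (row, col) of the candidate laser spot, or None if no pixel
    is both bright enough and clearly redder than its other channels.
    """
    frame = np.asarray(frame, dtype=np.float32)
    red = frame[:, :, 0]
    # A laser dot is much redder than its surroundings; subtracting the
    # strongest of the green/blue channels suppresses white glare, which
    # is bright in all three channels at once.
    redness = red - frame[:, :, 1:].max(axis=2)
    idx = np.unravel_index(np.argmax(redness), redness.shape)
    if red[idx] < min_brightness or redness[idx] <= 0:
        return None
    return (int(idx[0]), int(idx[1]))

# Synthetic 100x100 frame with a red "laser dot" at row 40, column 60.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40, 60] = (255, 30, 30)
print(find_laser_spot(frame))  # -> (40, 60)
```

A real system would run this per video frame and require the spot to persist for a few seconds before beeping and moving, matching the dwell-time behavior described above.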
Kemp said engineers are often too focused on making robots behave like people, ignoring other ways they can interact.
“How can you make robots that are actually useful? That was bugging me,” he said. “And it’s a hard question to answer; that’s why I’m happy with this. We made technical contributions as well as something that actually helps users.”
The robot successfully fetches its target objects off the floor 90 percent of the time.