Stories from ANU Reporter and ANU News
CECS Spotlight: Eye robot
A new study of bee vision through robotics could revolutionise the way we drive. The first law of robotics states that a robot may not allow a human to come to harm. When writer Isaac Asimov published these laws in 1942, the limitations of technology meant such autonomous robots could exist only in the field of Asimov's renown: science fiction.
Today, researchers are developing the tools for vision systems that could keep people from harm, putting Asimov's law into practice. Helping them achieve this end is a team of unassuming, disc-like robots on wheels.
Dr Nick Barnes is an adjunct fellow with the Department of Systems Engineering within the Research School of Information Science and Engineering at ANU and a researcher at National ICT Australia (NICTA). In collaboration with a team of colleagues and postgraduate students, his goal is to develop computer vision that will assist people when they are driving.
"We're not trying to take control of the vehicle, but helping people to drive more safely. Our systems could alert the driver when he is getting too close to an obstacle, or tell him where the traffic signs are. This could be especially useful for older drivers who have more trouble registering the environment around them."
Most road situations are complex. Not only are cars constantly moving, but other vehicles are jockeying around them too. In order to register these shifting surroundings, Dr Barnes and his team believe the best vision model could come from an unlikely source: a bee's eye.
"There are two ways in which biology allows vision. One of them is fixated, like primates, where, rather than taking in an entire scene, you're looking to particular points in that scene and gathering information from fixations.
"Under motion, if I'm walking along in an environment, and I fixate to a point, I add a set of constraints to my visual impression of the environment that make it easier to understand what is going on. So if I look at a particular object, I can estimate the proximity of objects around it, what my path is, and my speed.
"But insects like bees have a full, spherical point of view. They fixate to every point in the environment, and can see everything going on around them. If we can learn more about how they use this kind of vision, the potential for autonomous robotic systems is exciting."
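The proximity, path and speed estimates Dr Barnes describes can be recovered from purely angular measurements, without ever knowing an object's true size or distance. A minimal sketch of one such cue, time-to-contact, follows; the numbers and function names are illustrative only and are not drawn from the team's actual system:

```python
# Time-to-contact from image measurements alone: as an obstacle approaches,
# its angular size theta grows, and theta divided by its growth rate gives
# the seconds remaining until collision. Hypothetical numbers for illustration.

def time_to_contact(theta, theta_prev, dt):
    """Estimate seconds until collision from angular size and its growth rate."""
    theta_dot = (theta - theta_prev) / dt
    return theta / theta_dot

# Simulated approach: obstacle 1 m wide, starting 20 m away, closing at 10 m/s.
width, dist, speed, dt = 1.0, 20.0, 10.0, 0.01
theta_prev = width / dist                 # small-angle size at t = 0 (radians)
dist -= speed * dt
theta = width / dist                      # angular size one frame later

tau = time_to_contact(theta, theta_prev, dt)
print(f"estimated time to contact: {tau:.2f} s")   # prints: estimated time to contact: 2.00 s
```

Note that the estimate matches the true value (20 m at 10 m/s gives 2 s) even though the code never uses `width`, `dist` or `speed` directly inside the estimator; only the two angular measurements are needed.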
The research team will use two 190-degree cameras to replicate the vision of a bee, and then sit the eyes on a mobile robot that will simulate flight. These small, disc-like robots can roll in any direction, while a tower and tilt platform let the cameras rise, fall and tilt freely. In this way, they hope to better understand how bees see the world, and how this could be incorporated into driver-assistance applications.
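Two 190-degree lenses mounted back-to-back are enough to cover the full viewing sphere, since each sees 95 degrees off its axis and the two fields overlap in a band around the robot's equator. Under the common equidistant fisheye model (an assumption here; the article does not name the projection the team uses), a pixel's distance from the image centre maps linearly to viewing angle:

```python
import math

# Equidistant fisheye model: radial pixel distance r maps to viewing angle
# theta via r = f * theta. The lens parameters below are illustrative,
# not taken from the article.
FOV_DEG = 190.0           # per-lens field of view
IMAGE_RADIUS_PX = 500.0   # radius of the circular fisheye image

f = IMAGE_RADIUS_PX / math.radians(FOV_DEG / 2.0)  # focal length, pixels/radian

def pixel_to_angle(r_px):
    """Angle (degrees) off the optical axis for a pixel r_px from the centre."""
    return math.degrees(r_px / f)

print(pixel_to_angle(0.0))    # optical axis: 0 degrees
print(pixel_to_angle(500.0))  # image edge: 95 degrees (half of 190)

# Back-to-back, lens A covers 0-95 degrees of polar angle and lens B covers
# 85-180 degrees, so the sphere is fully covered with a 10-degree overlap band.
```

The 10-degree overlap band is what makes a 190-degree (rather than 180-degree) lens attractive for this kind of rig: features crossing from one hemisphere to the other are briefly seen by both cameras.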
"The raw robot we're working with is intended to be highly mobile and flexible, so we can get a very free range of quite precise motions. We could have gone with a gantry robot, where you've got very precise movements, but then you've got problems around restricted workspaces.
"With a mobile robot, you can have a much larger workspace. This will also help us understand such questions as how bees accurately record distance over long distances."
The use of vision in computer systems has been Dr Barnes' driving interest since his student days, playing a central part in his ongoing robotics research. It's also led to some more tangential pursuits, such as directing a team of robots in the Robocup soccer tournament in Japan.
"Vision is a hard problem, and there are lots of interesting aspects of it to study. I'm more interested when it's involved in a system, primarily, and a system that is moving around and trying to understand the environment is where I'm coming from as a researcher," he says.
"I think cognition and the way people understand the world is one of the great unknowns. Perception is one of those ways to take this information that comes in and turn that into functional ability. I prefer to study perception rather than language, say, because it is closer to primary reflexes.
"Language has the barrier of the need to talk to people, and people are quite high-level. But with perception, we can study things like insects, which are relatively simple and don't have the high-level complexity of what they're doing. People are capable of solving problems in a whole variety of ways, so it's hard to say how this person is doing it. Whereas with an insect it's a little bit easier to break it down."
Understanding the vision of the humble bee could eventually mean insect-vision systems become commonplace in cars, trucks and buses, helping to guide drivers and save lives. Robots that keep humans from harm - the twinkle in Asimov's eye could be just around the corner.