Bees inspire robot aircraft.


Scientists at Australia’s Vision Centre have discovered how the honeybee can land anywhere with utmost precision and grace – and the knowledge may soon help build incredible robot aircraft.

By sensing how rapidly their destination ‘zooms in’ as they fly towards it, honeybees can control their flight speed in time for a perfect touchdown without needing to know how fast they’re flying or how far away the destination is.


This discovery may advance the design of cheaper, lighter robot aircraft that only need a video camera to land safely on surfaces of any orientation, says Professor Mandyam Srinivasan of The Vision Centre (VC) and The University of Queensland Brain Research Institute.

“Orchestrating a safe landing is one of the greatest challenges for flying animals and airborne vehicles,” says Prof. Srinivasan. “To achieve a smooth landing, it’s essential to slow down in time for the speed to be close to zero at the time of touchdown.”

Humans can judge their distance from an object using stereo vision – because their two eyes, separated by about 65 mm, capture slightly different views of the object. Insects, however, can’t do the same thing because their eyes are too close together, Prof. Srinivasan explains.

“So in order to land on the ground, they use their eyes to sense the speed of the image of the ground beneath them,” he says. “By keeping the speed of this image constant, they slow down automatically as they approach the ground, stopping just in time for touchdown.
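This “hold the image speed constant” rule can be sketched in a few lines of simulation. For a bee flying forward at speed v at height h, the ground directly beneath sweeps past at angular velocity v/h, so holding v/h at a setpoint forces the speed to fall in proportion to the height. All numbers below are invented for illustration – they are not measured bee values.

```python
import math

# Illustrative assumptions only -- not measured bee values.
OMEGA_SET = 2.0                  # ground-image angular velocity to hold (rad/s)
GLIDE_ANGLE = math.radians(30)   # fixed descent angle
DT = 0.01                        # simulation step (s)

h = 10.0                         # height above ground (m)
v = OMEGA_SET * h
while h > 0.05:
    v = OMEGA_SET * h            # forward speed that keeps the image speed v/h constant
    h -= v * math.tan(GLIDE_ANGLE) * DT   # sink along the fixed glide angle
# Because v is proportional to h, speed and height decay to zero together:
# the bee "stops just in time for touchdown" without knowing either quantity.
```

The bee never needs to measure v or h separately – only their ratio, which is exactly what the moving image on its eye provides.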

“However, in the natural world, bees would only occasionally land on flat, horizontal surfaces. So it’s important to know how they land on rough terrain, ridges, vertical surfaces or flowers with the same delicacy and grace.”

In the study, the VC researchers trained honeybees to land on vertically mounted discs, and filmed them using high-speed video cameras.

“The discs carried spiral patterns that could be rotated at various speeds by a motor,” says Prof. Srinivasan. “When we spun the spiral to make it appear to expand, the bees ‘hit the brakes’ because they thought they were approaching the disc much faster than they really were.

“When we spun the spiral the other way to make it appear to contract, the bees sped up, sometimes crashing into the disc. This shows that landing bees keep track of how rapidly the image ‘zooms in’, and they adjust their flight speed to keep this ‘zooming rate’ constant.”

“Imagine you’re in space and you don’t know how far away you are from a star,” Prof. Srinivasan says. “As you fly towards it, the other stars ‘move away’ and it becomes the focus. Then when the star starts to ‘zoom in’ faster than the regular rate, you’ll slow down to keep the ‘zooming rate’ constant.

“It’s the same for bees – when they’re about to reach a flower, the image of the flower will expand faster than usual. This causes them to slow down more and more as they get closer, eventually stopping when they reach it.”

The VC researchers also developed a mathematical model for guiding landings, based on the bees’ landing strategy. Prof. Srinivasan says unlike all current engineering-based methods, this visually guided technique does not require knowledge about the distance to the surface or the speed at which the surface is approached.
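The published model is not reproduced here, but its core idea – hold the relative rate of image expansion (approach speed divided by distance) constant – can be sketched as a toy simulation. The setpoint and starting distance below are made-up values, and the simulated distance d is used only to generate the expansion cue; a real controller would measure that cue directly from the camera, without ever knowing d or v individually.

```python
R_SET = 1.5   # target expansion rate (1/s); an assumed setpoint, not from the paper
DT = 0.001    # simulation step (s)
d = 5.0       # distance to the surface (m); the controller never reads this directly
t = 0.0
while d > 0.001:
    v = R_SET * d      # approach speed that holds the expansion rate v/d constant
    d -= v * DT        # move toward the surface
    t += DT
# Holding v/d constant gives d(t) = d0 * exp(-R_SET * t):
# distance and speed decay to zero together, on a surface of any orientation.
```

Because v shrinks in proportion to the remaining distance, touchdown happens at near-zero speed – the exponential-approach behaviour the bees’ braking suggests.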

“The problem with current robot aircraft technology is that they need radar, sonar or laser beams to work out how far away the surface is,” Prof. Srinivasan says. “Not only is the equipment expensive and cumbersome, but using active radiation can also give the aircraft away.

“On the other hand, this vision-based system only requires a simple video camera of the kind found in smartphones. The camera, by ‘seeing’ how rapidly the image expands, allows the aircraft to land smoothly and undetected on a wide range of surfaces with the precision of a honeybee.”

‘Primitive’ brain recognises edges.


Scientists at Australia’s Vision Centre (VC) have found a group of rare cells in the human brain that recognise edges – helping us to avoid accidents and recognise everything we use or see in daily life.


To their surprise, they located the cells in the ‘primitive’ brain – the part of our brain that was previously thought only to pass information from the eye to the higher brain, or cortex, for interpretation.

Their discovery has thrown new light on how the vision system of humans and other primates operates – and how we use vision to move around, find food, read, recognise faces and function day-to-day.

Importantly, the knowledge could help develop medical devices for reversing blindness such as the bionic eye, says Professor Paul Martin of The VC and The University of Sydney (USyd).

“Our eyes and brain work together to give us a recognisable world,” Prof. Martin explains. “The eyes send the light signals they detect to the cortex or ‘modern’ brain which is responsible for higher functions like memory, thought and language.”

“Our vision cells respond to different information – some to colour, some to brightness, and now we’ve found the ones that respond to patterns,” Dr Kenny Cheong of The VC and USyd adds. “If you look at your computer screen, you’ll see it has four sides, and each side has an orientation – horizontal or vertical. The cells are sensitive to these ‘sides’.”
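The orientation selectivity Dr Cheong describes has a familiar computational analogue: filtering an image with oriented kernels, each of which responds only to edges of one orientation. The sketch below uses the standard Sobel kernels from computer vision – a loose illustration of the principle, not a model of the thalamic cells themselves.

```python
# Sobel kernels: each responds strongly to edges of one orientation.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # tuned to vertical edges
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # tuned to horizontal edges

def filter_response(img, kernel):
    """Slide a 3x3 kernel over the image (cross-correlation, no padding)."""
    h, w = len(img), len(img[0])
    return [[sum(img[y + i][x + j] * kernel[i][j]
                 for i in range(3) for j in range(3))
             for x in range(w - 2)]
            for y in range(h - 2)]

# A 5x5 test image with one vertical edge: dark on the left, bright on the right.
img = [[0, 0, 1, 1, 1] for _ in range(5)]
gx = filter_response(img, SOBEL_X)  # large values along the vertical edge
gy = filter_response(img, SOBEL_Y)  # all zero: no horizontal edges here
```

The vertically tuned filter fires along the brightness boundary while the horizontally tuned one stays silent – in caricature, the same division of labour as cells tuned to the ‘sides’ of a computer screen.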

What most surprised the researchers was the location of these cells. “We found these cells in the thalamus, which previously was only thought to pass information from the eyes to the cortex,” Dr Cheong says.

“This means that the cortex, or the ‘new’ brain, isn’t the only place that forms an image for us,” says Prof. Martin. “Even in the early stages, there are multiple pathways and signals going into the brain, so it isn’t simply doing a step-by-step construction of the world.

“While other animals including cats, rabbits, bees and chickens also have edge-detecting cells, this is the first study to indicate that primate vision – including human vision – does not all happen in the cortex.”

These cells are also exceedingly rare, Prof. Martin says. “We actually saw them ten years ago, but they were just a few cells out of thousands, so we thought it was a mistake and discarded the data.

“But they cropped up every once in a while, and when we finally put them together, they looked much more like cells in the cortex than cells in the thalamus.”

Dr Cheong says the study provides a better understanding of the visual system, which is crucial for the development of devices or treatments to restore vision.

“People who lose their vision lack the nerve cells that respond to light, which carries information such as colour, brightness and patterns,” he says. “So to develop a device like the bionic eye, we have to replicate the visual system, including these cells, using electronics. This means we must know what cells are present, how they work and what information they send to the brain.”