German scientists have built a flight simulator for flies to better understand how the insects see and coordinate their movements. What they learn might be of use in developing robots that can move around their environment.
Photo courtesy USDA
“A fly’s brain enables the unbelievable–the animal’s easy negotiation of obstacles in rapid flight, split-second reaction to the hand that would catch it, and unerring navigation to the smelly delicacies it lives on,” says Technische Universität München in a statement about the research.
“Researchers have long known that flies take in many more images per second than humans do. For human eyes, anything more than 25 discrete images per second will merge into a continuous movement. A blowfly, on the other hand, can perceive 100 images per second as discrete sense impressions and interpret them quickly enough to steer its movement and precisely determine its position in space.
“Yet the fly’s brain is hardly bigger than a pinhead, too small by far to enable the fly’s feats if it functioned exactly the way the human brain does.
“It must have a simpler and more efficient way of processing images from the eyes into visual perception, and that is a subject of intense interest for robot builders,” TUM says.
Robots have great difficulty perceiving their surroundings through their cameras, and even more difficulty making sense of what they see, TUM adds.
“Even the recognition of obstacles in their own work space takes too long. So people still need to protect their automated helpers, for example, by surrounding them with safety enclosures.”
A more direct, supportive collaboration between human and machine is a central research goal of the “excellence cluster” named CoTeSys, Cognition for Technical Systems, a collaboration of about a hundred scientists and engineers from five universities and institutes in the Munich area of Germany.
To understand how flies see and coordinate their movements, the CoTeSys group built a flight simulator for flies.
“Here they’re investigating what goes on in flies’ brains while they’re flying. Their goal is to put similar capabilities in human hands–for example, to aid in developing robots that can independently apprehend and learn from their surroundings,” TUM says.
Photo by David Braun
On a wraparound display, the researchers present diverse patterns, movements, and sensory stimuli to blowflies. The insect is held in place by a tether, so that electrodes can register the reactions of its brain cells, enabling the researchers to observe and analyze what happens in a fly’s brain when the animal whizzes in criss-cross flight around a room, TUM says.
Moving pictures displayed here simulate flight for an immobilized fly; electrodes give researchers a window into the fly’s neural activity and vision processing.
Photo courtesy Max Planck Institute for Neurobiology
The first results show one thing very clearly: The way flies process the images from their immobile eyes is completely different from the way the human brain processes visual signals, the university adds.
“Movements in space produce so-called ‘optical flux fields’ that characterize specific kinds of motion definitively.
“In forward motion, for example, objects rush past on the sides, and foreground objects appear to get bigger. Near and distant objects appear to move differently.
“The first step for the fly is to construct a model of these movements in its tiny brain. The speed and direction with which objects before the fly’s eyes appear to move generate, moment by moment, a typical pattern of motion vectors, the flux field, which in a second step is assessed by the so-called ‘lobula plate,’ a higher level of the brain’s vision center.
“In each hemisphere there are only 60 nerve cells responsible for this; each reacts with particular intensity when presented with the pattern appropriate to it.
“For the analysis of the optical flux fields, it’s important that motion information from both eyes be brought together. This happens over a direct connection of specialized neurons called VS cells. In this way, the fly gets a precise fix on its position and movement.”
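The idea described above, that specialized cells fire most strongly when the incoming flux field matches their preferred motion pattern, can be illustrated with a toy sketch. This is purely an illustration of template matching on flow fields, not the researchers' actual neural model; the field shapes and the cosine-similarity "response" are assumptions made for the example.

```python
import numpy as np

def flow_field(kind, n=8):
    """Return a toy optic-flow field: motion vectors on an n x n grid.

    'forward' - expansion: objects appear to grow and rush past the sides.
    'yaw'     - rotation about the vertical axis: uniform sideways drift.
    """
    ys, xs = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    if kind == "forward":  # radial expansion away from the center of view
        return np.stack([xs, ys])
    if kind == "yaw":      # uniform horizontal flow across the whole field
        return np.stack([np.ones_like(xs), np.zeros_like(ys)])
    raise ValueError(kind)

def cell_response(template, observed):
    """Cosine similarity: a model 'cell' responds with particular
    intensity when the observed flux field matches its template."""
    t, o = template.ravel(), observed.ravel()
    return float(t @ o / (np.linalg.norm(t) * np.linalg.norm(o) + 1e-12))

# Simulated self-motion: the fly is flying straight ahead.
observed = flow_field("forward")

responses = {k: cell_response(flow_field(k), observed)
             for k in ("forward", "yaw")}
best = max(responses, key=responses.get)
print(best, responses)  # the 'forward' template responds most strongly
```

The point of the sketch is the economy of the scheme: matching a handful of fixed templates against the current flow field is a far cheaper computation than full scene reconstruction, which is why a circuit of only 60 cells per hemisphere can do the job.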
Image courtesy Max Planck Institute for Neurobiology
“Thanks to these results, the network of VS cells in the fly’s brain responsible for rotational movement is now one of the best understood circuits in the nervous system,” explains Alexander Borst, a neurobiologist from the Max Planck Institute for Neurobiology.
Under the leadership of Martin Buss and Kolja Kühnlenz, the TUM researchers are working to develop intelligent machines that can observe their environment through cameras, learn from what they see, and react appropriately to the current situation, the university explained.
“Their long-range aim is to enable the creation of intelligent machines that can interact with people directly, effectively, and safely. Even in factories, the safety barriers between humans and robots should fall. To that end, simple, fast, and efficient methods for the analysis and interpretation of camera pictures are absolutely essential.”
TUM researchers are developing small, flying robots whose position and movement in flight will be controlled by a computer system for visual analysis inspired by the example of the fly’s brain, the university said.
Robot Asks for Directions
Another TUM-built mobile robot, the Autonomous City Explorer (ACE), was challenged to find its way from the institute to Marienplatz at the heart of Munich–a distance of about a mile–by stopping passers-by and asking for directions. To do this, ACE had to interpret the gestures of people who pointed the way, and it had to negotiate the sidewalks and traffic crossings safely, TUM said.
“Increasingly natural interaction between intelligent machines and humans is unthinkable without efficient image analysis. Insights gained from the flight simulator for flies–through the scientific interplay CoTeSys fosters among researchers from various disciplines–offer an approach that might be simple enough to be technically portable from one domain to the other, from the insects to the robots.”
Navigating only by asking pedestrians it encountered for directions, the robot called ACE, or Autonomous City Explorer, made its way from the institute where it was built–at TUM, the Technische Universitaet Muenchen–to Marienplatz roughly a mile away. A project of the Munich-based CoTeSys collaboration, ACE is part of a larger effort to enable more natural, effective, and safe interaction between machines and people.
Photo courtesy LSR/TUM