Technology: Insects help with the problem of artificial human vision


By LIZ GLASGOW in CANBERRA

SCIENTISTS at the Australian National University (ANU) in Canberra have developed artificial eyes that ‘see’ the way that flying insects do. Like many insects, the eyes use cues provided by the apparent motion of an object to determine its direction and range. The work could lead to unobtrusive visual aids for blind people, as well as applications in robotics and the space and military industries.

The Australian team believes that its approach could help overcome some of the problems that have plagued the development of artificial systems which attempt to simulate human stereo vision. Adrian Horridge, director of the ANU’s centre for visual sciences, says it is presumptuous to try to copy human perception for a mobile device. ‘Natural visual systems have evolved along economical and obviously effective lines, and if we want to make a mobile robot that sees we would be quite satisfied if early models perform as well as insects,’ he says. ‘Insects with their comparatively simple eyes and tiny brains are quite capable of navigating, chasing mates or prey and escaping from danger. If you want the performance of a fly, copy the principles from a fly.’

Unlike human beings, insects cannot use binocular vision to judge range. Research has shown that the bee, and possibly other flying insects, uses the apparent motion of an image across the retina to gauge its range, in much the same way that we can see objects in three dimensions with one eye shut by moving the head. Knowing its own speed and the apparent angular velocity of the object’s image, the bee can gauge the object’s distance. Experiments have shown that the bee can measure angular velocity, and therefore distance, irrespective of the structure of its surroundings.

These findings formed the basis for two devices. One model, a hand-held scanning device, has an eye built from the lens of a microscope and a charge-coupled device (CCD), a light-sensitive electronic chip.
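The bee's ranging principle can be sketched in a few lines of Python. The function name and the numbers below are illustrative, not from the ANU work: for an object abeam of a moving observer, the image sweeps across the retina at an angular velocity of v/d, so the distance can be recovered as d = v/omega.

```python
# Sketch of range-from-motion ("motion parallax") as described for the bee:
# knowing its own speed and the apparent angular velocity of an object's
# image, range follows directly. Names and values are illustrative.

def range_from_motion(observer_speed, angular_velocity):
    """Estimate distance to an object abeam of a moving observer.

    observer_speed   -- metres per second (the observer's own speed)
    angular_velocity -- radians per second of the image across the retina

    For an object at right angles to the direction of travel, the image
    sweeps at omega = v / d, so d = v / omega.
    """
    return observer_speed / angular_velocity

# A bee flying at 2 m/s sees a flower's image sweep at 1 rad/s:
print(range_from_motion(2.0, 1.0))  # 2.0 metres
```

Note that this simple relation only holds exactly for objects perpendicular to the line of travel; the bee's estimate, like the ANU devices', is a local one made direction by direction.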
A line of 128 pixels, 1.6 millimetres long, detects boundaries between light and dark objects. As the scene is scanned by moving the device at a steady rate from side to side, closer objects move across the field of view faster than more distant ones. Using a theoretical model of the perception of motion, called the gradient model, a computer calculates the velocity at each point in the scene by measuring changes in the intensity of light at each angle. This gives the range of objects in each direction.

The device then converts the velocities into stereo tones, creating an aural space in which the motion of objects is effectively heard. Nearer objects pass across the aural space faster, and with a higher tone frequency, than more distant objects. Alternatively, the device could produce tactile sensations to describe the scene.

The second device, which uses five photodetectors, gives high resolution in a field of view of about 2 to 3 degrees, over a range of 1 to 5 metres. It can pick up a small object, such as a pencil, at a distance of 2 metres. This system uses angular and spatial gradients, detected by the five elements, to calculate the range of objects. The device does not have to be scanned; it simulates motion by combining five snapshots taken simultaneously and producing a weighted sum of the outputs. Like the first device, it encodes the distance of an object through an audio amplifier as the pitch of a note.

Both devices, says the team, have advantages over conventional ultrasonic devices, which are prone to giving ambiguous range readings because the sound beam can be reflected, and whose resolution copes poorly with small objects. The scanning device gives the user or robot a panoramic view, picking out a series of objects at different ranges in each direction; it can also locate objects at very close range. The five-detector camera has a narrower field of view but can detect immediate hazards in the direction it points.

While two devices have been patented,
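As a rough illustration of the gradient model, the apparent velocity along a scanned line of pixels can be estimated from spatial and temporal intensity gradients, and the result mapped to a tone whose pitch rises for nearer objects. This is a minimal sketch under assumed units (pixels per scan, an arbitrary base pitch), not the ANU implementation:

```python
# 1-D gradient model: for two successive scans of a pixel line, the apparent
# image velocity at each point is approximately v = -(dI/dt) / (dI/dx).
# Nearer objects produce larger velocities, so pitch is mapped inversely
# to range. All units and the pitch mapping are illustrative assumptions.

def image_velocity(frame_prev, frame_next, dt=1.0, dx=1.0):
    """Per-pixel velocity estimates from two scans of a line of pixels."""
    velocities = []
    for i in range(1, len(frame_prev) - 1):
        di_dx = (frame_prev[i + 1] - frame_prev[i - 1]) / (2 * dx)  # spatial gradient
        di_dt = (frame_next[i] - frame_prev[i]) / dt                # temporal gradient
        velocities.append(-di_dt / di_dx if di_dx != 0 else 0.0)
    return velocities

def pitch_for_range(distance, base_hz=200.0):
    """Nearer objects give a higher tone, as in the device's aural display."""
    return base_hz / distance

# A bright edge that has shifted one pixel to the right between scans
# registers as a positive velocity at the edge:
prev = [0, 0, 1, 1, 1]
nxt = [0, 0, 0, 1, 1]
print(image_velocity(prev, nxt)[1])  # 2.0 at the moving edge
```

The gradient method only recovers velocity reliably where the intensity actually changes, which is consistent with the device's reliance on boundaries between light and dark objects.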