Studies in computer vision have only recently realised the advantage of adding a behavioural component to vision systems, enabling them to make programmed 'eye movements'. Such an animate vision capability allows a system to employ a nonuniform or foveal sampling strategy, with gaze-control mechanisms repositioning the limited high-resolution area of the visual field. The hoverfly Syritta pipiens is an insect that exhibits foveal animate vision behaviour strikingly similar to the corresponding activity in humans. This paper discusses a simulation model of Syritta created for studying the neural processes underlying such visually guided behaviour. The approach differs from standard "neural network" modelling techniques in that the simulated Syritta exists within a closed simulated environment, i.e. there is no need for human intervention: such an approach is an example of computational neuroethology.
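The combination of nonuniform sampling and gaze control described above can be illustrated with a minimal sketch. This is not the paper's model: the function names, the square fovea, and the simple clamped-step saccade rule are all illustrative assumptions, chosen only to show the two ingredients (dense sampling near the gaze point, sparse sampling in the periphery, and a mechanism that repositions the gaze).

```python
# Illustrative sketch only -- not the model from the paper.
# A "foveal" sensor samples densely near the gaze point and
# coarsely elsewhere; a crude gaze-control rule moves the fovea.

def foveal_sample(image, gaze, fovea_radius=2, periphery_step=4):
    """Sample `image` (a 2-D list) at full resolution within
    `fovea_radius` of `gaze`, and only every `periphery_step`
    pixels outside it. Returns a dict of (y, x) -> value."""
    gy, gx = gaze
    samples = {}
    for y in range(len(image)):
        for x in range(len(image[0])):
            in_fovea = (abs(y - gy) <= fovea_radius and
                        abs(x - gx) <= fovea_radius)
            in_coarse_grid = (y % periphery_step == 0 and
                              x % periphery_step == 0)
            if in_fovea or in_coarse_grid:
                samples[(y, x)] = image[y][x]
    return samples

def saccade_towards(gaze, target, max_step=3):
    """Move the gaze point towards `target`, clamping each axis
    to at most `max_step` pixels per saccade (a stand-in for a
    gaze-control mechanism)."""
    gy, gx = gaze
    ty, tx = target
    clamp = lambda d: max(-max_step, min(max_step, d))
    return (gy + clamp(ty - gy), gx + clamp(tx - gx))
```

With a 16x16 image and the gaze at (8, 8), the fovea contributes a dense 5x5 patch of samples while the periphery contributes only every fourth pixel, so repositioning the fovea (via `saccade_towards`) is what brings new regions into high resolution.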
This paper is not available online