The University of Sussex

Neural networks for visual tracking in an artificial fly

D.T. Cliff

To appear in: Proc. 1st European Conference on Artificial Life (ECAL91), Paris, December 11-13, 1991.

This paper reports on work using artificial life techniques to study issues in low-level animate vision. Animate vision is visual processing performed with dynamic control of the position/orientation of the image-acquisition device (eye/camera). Male Syritta pipiens hoverflies perform animate vision that is closely analogous to some aspects of human animate vision. This paper describes experiments with an artificial animal (i.e. an animat) called SyCo. SyCo has been constructed to explore possible processing strategies that reproduce, at the behavioural level, Syritta's animate vision capabilities. The processing strategies are embodied within artificial neural networks which effect local, rather than centralized, control of flight behaviour. Each network is responsible for the generation of one particular visually-mediated behaviour. The design principles for SyCo have been influenced by Braitenberg's Vehicles and by Wilson's Animats: the emphasis is on minimalism, so that specifications are simple rather than complex. SyCo exists within a dynamic simulated environment that includes other flies. Typically SyCo will select another fly as a visual target, and then dynamically orient itself to keep the target in the centre of the field of view. Simultaneously, SyCo maintains a constant distance to the target. Distance regulation is achieved without explicit internal representation of distance. Seemingly complex behaviours arise from the interaction of a simple agent with its dynamic environment.
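The abstract's claim that distance can be regulated without an explicit internal representation of distance can be illustrated with a minimal Braitenberg-style control sketch. This is not the paper's neural-network implementation; it is a hypothetical proportional controller in which the agent reacts only to retinal quantities (the target's bearing and its apparent angular size), and all names, gains, and parameters below are illustrative assumptions:

```python
import math

def tracking_step(agent_x, agent_y, agent_heading,
                  target_x, target_y,
                  k_turn=0.5, k_speed=0.8, desired_size=0.2,
                  target_radius=0.05, dt=1.0):
    """One control step: turn toward the target and regulate distance
    using only retinal quantities (bearing and apparent angular size).
    No distance variable is stored or used by the controller itself."""
    dx, dy = target_x - agent_x, target_y - agent_y
    # Bearing of the target on the retina, wrapped to [-pi, pi].
    bearing = math.atan2(dy, dx) - agent_heading
    bearing = math.atan2(math.sin(bearing), math.cos(bearing))
    # Apparent size: the angle the target subtends in the field of view.
    # (True distance appears only here, as part of simulating the retina.)
    distance = math.hypot(dx, dy)
    apparent_size = 2.0 * math.atan2(target_radius, distance)
    # Orienting behaviour: turn so the target stays centred.
    turn = k_turn * bearing
    # Distance regulation: if the target looks too small, approach;
    # if it looks too big, back away. The set point is a retinal size,
    # not a distance.
    speed = k_speed * (desired_size - apparent_size)
    agent_heading += turn * dt
    agent_x += speed * math.cos(agent_heading) * dt
    agent_y += speed * math.sin(agent_heading) * dt
    return agent_x, agent_y, agent_heading
```

Iterating this step from an arbitrary start pose, the agent converges to facing the target at whatever distance makes the target subtend `desired_size` on the retina, illustrating how a fixed following distance can emerge from purely local, representation-free control.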

This paper is not available online