Research

We combine lab and field studies with computational modelling and robotics to investigate the mechanisms underpinning insect navigation.

BBSRC Visual navigation in ants: from visual ecology to brain

We will use complementary methodological approaches to understand the nature of the visual memory that supports navigation in ants. We will develop procedures for carrying out brain lesions in ants, targeting a range of locations in the central complex and mushroom body, and investigate the consequences for visual navigation tasks. Subsequent histology will allow us to correlate lesion locations with behavioural deficits.

In parallel, we will establish a new experimental system using a compensatory treadmill to give precise control over the visual stimulation presented to freely walking ants. This method will enable rapid, minimally invasive transfer of ants from a conventional arena training paradigm to the controlled testing paradigm, supporting high-throughput experiments. These experimental methods will be coupled with analyses of the information content of natural scenes from the ant habitat, both to refine the stimulus paradigms and to provide realistic input to computational models.

An agent model (a simulated ant moving through a virtual world) will let us test specific visual navigation algorithms under conditions precisely parallel to those the animal experiences, and thus devise crucial experimental paradigms under which alternative models make different predictions. In particular, we will examine which eye regions are critical, what image information is essential, and which encoding and retrieval schemes account most efficiently and effectively for navigational behaviour. In the same agent model, we will also test more detailed models of the relevant brain circuitry to understand how it could support such processing, closing the loop with predictions for new trackball and lesion studies, and potential extensions towards single-cell electrophysiology in the relevant brain regions.
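One family of algorithms an agent model like this can test is familiarity-based navigation: the agent stores panoramic views along a training route and, at each step, turns towards the heading whose current view best matches memory. The sketch below is illustrative only (the function names, the RMS difference measure, and the use of 1-D grayscale panoramas as NumPy arrays are our assumptions, not the project's actual implementation):

```python
import numpy as np

def familiarity(view, memory_bank):
    """Familiarity = negative of the smallest pixelwise RMS difference
    between the current view and any stored training view."""
    diffs = [np.sqrt(np.mean((view - m) ** 2)) for m in memory_bank]
    return -min(diffs)

def best_heading(panorama, memory_bank, n_headings=36):
    """Rotate a panoramic view (last axis = azimuth) through n_headings
    offsets and return the column shift that looks most familiar."""
    width = panorama.shape[-1]
    scores = []
    for k in range(n_headings):
        shift = k * width // n_headings
        rotated = np.roll(panorama, shift, axis=-1)
        scores.append(familiarity(rotated, memory_bank))
    return int(np.argmax(scores)) * width // n_headings
```

Scanning headings in this way mimics the scanning behaviour observed in ants, and gives the agent a recovery direction without it ever storing an explicit map.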

EPSRC Brains on Board: Neuromorphic Control of Flying Robots

What if we could design an autonomous flying robot with the navigational and learning abilities of a honeybee? Such a computationally and energy-efficient autonomous robot would represent a step-change in robotics technology, and it is precisely what the 'Brains on Board' project aims to achieve.

Autonomous control of mobile robots requires robustness to environmental and sensory uncertainty, and the flexibility to deal with novel environments and scenarios. Animals solve these problems by having flexible brains capable of unsupervised pattern detection and learning. Even 'small'-brained animals like bees exhibit sophisticated learning and navigation abilities using highly efficient brains of up to just one million neurons, roughly 100,000 times fewer than in a human brain. Crucially, these mini-brains nevertheless support a high degree of multi-tasking and adapt, within the lifetime of an individual, to completely novel scenarios; this is in marked contrast to typical control-engineering solutions.

This project will fuse computational and experimental neuroscience to develop a ground-breaking new class of highly efficient 'brain on board' robot controllers, able to exhibit adaptive behaviour while running on powerful yet lightweight general-purpose graphics processing unit (GPU) hardware now emerging for the mobile devices market. This will be demonstrated through autonomous, adaptive control of a flying robot using an on-board computational simulation of the bee's neural circuits: an unprecedented achievement in robotics.
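The building block of such an on-board neural simulation is typically a simple spiking neuron model integrated with a fixed time step, so that thousands of neurons can be updated in parallel on a GPU. As a minimal sketch, assuming leaky integrate-and-fire dynamics and a forward-Euler update (the parameter values and function name here are illustrative, not the project's model):

```python
import numpy as np

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=-65.0,
                 v_reset=-65.0, v_thresh=-50.0, r_m=10.0):
    """Forward-Euler integration of a leaky integrate-and-fire neuron:
    tau * dV/dt = -(V - v_rest) + r_m * I(t),
    with a spike recorded and V reset whenever V crosses v_thresh.
    Returns the list of time steps at which spikes occurred."""
    v = v_rest
    spikes = []
    for t, i_t in enumerate(input_current):
        v += dt / tau * (-(v - v_rest) + r_m * i_t)
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset
    return spikes
```

Because each neuron's update depends only on its own state and its summed input, this loop maps naturally onto one GPU thread per neuron, which is what makes bee-scale simulations feasible on lightweight mobile hardware.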