Workshop on Synthetic Neuroethology

 

University of Sussex, Brighton, UK, 9-10 September 2010

 

Talk Abstracts

 

Understanding the brain through active touch sensing in rats and robots
Tony Prescott (Sheffield University)

 

The systems approach in the brain sciences has demonstrated that there is no straightforward decomposition of the brain into modules, nor even a simple means to separate the brain from the body (in control terms), or the body from the environment. So how should we proceed to understand the relationship between brain and behaviour? Our approach has been to investigate a complete sensorimotor loop: specifically, the guidance of exploratory behaviour by tactile sensing signals in the rat whisker (vibrissal) system. We investigate this system using both neuroethological methods and a synthetic approach, whereby we test functional hypotheses using physical (robotic) models. The first part of this talk will review neurobehavioural experiments showing a tight coupling between vibrissal sensing signals in the rat and active control of the movement and positioning of the whiskers. The second half will introduce the SCRATCHbot robotic platform, a biomimetic robot equipped with bilateral arrays of actuated artificial whiskers and controlled by models of the relevant neural circuits. I will describe how experiments using this platform are providing insights into the sensory modulation of whisking pattern generation; the role of the superior colliculus in orienting to tactile stimuli; and a possible role for the cerebellum in cancelling self-generated sensory noise.
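
As a rough illustration of the last point, the cerebellar cancellation idea can be caricatured as an adaptive forward model: an efference copy of the whisking command is used to predict, and subtract, the self-generated component of the sensor signal. The sketch below is a minimal, hypothetical Python version; the whisking frequency, reafferent gain, and learning rate are invented for illustration and are not taken from the SCRATCHbot controller.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: an efference-copy forward model that
# learns to predict the sensory consequences of whisking and subtracts
# them, leaving mainly externally caused (contact) signals.
T = 1000
t = np.arange(T) * 0.001                       # 1 ms time steps
whisk_cmd = np.sin(2 * np.pi * 8 * t)          # ~8 Hz whisking drive

reafference = 0.8 * whisk_cmd                  # self-generated component
contact = np.zeros(T)
contact[600:620] = 1.0                         # brief external touch event
sensor = reafference + contact + 0.05 * rng.standard_normal(T)

# Adaptive forward model: a single gain learned by LMS so that
# gain * whisk_cmd approximates the reafferent part of the signal.
gain, lr = 0.0, 0.01
residual = np.empty(T)
for i in range(T):
    pred = gain * whisk_cmd[i]
    residual[i] = sensor[i] - pred             # cancellation output
    gain += lr * residual[i] * whisk_cmd[i]    # LMS update

# After learning, `residual` is dominated by the contact event.
print(f"learned gain ~= {gain:.2f} (true value 0.8)")
```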

--------------------------------------------------------------

 

Spiking Neuronal Network Model of Unsupervised Olfactory Learning on Graphics Processing Units

Thomas Nowotny (University of Sussex)

 

Parallel computing has a long history, dating back at least to the late
1950s and early 1960s. Only recently, however, has it grown beyond the
domain of expensive supercomputers and entered the broader market in the
form of multi-core processors. Somewhat independently, and less noticed
by the general scientific community, graphics processing units (GPUs)
have also become powerful, highly parallel computing devices that can now
challenge the de facto monopoly of the few big CPU manufacturers.

In this talk I will present the parallel implementation of a spiking
neuronal network model with biologically realistic morphology, elements,
and function on a GPU using the NVidia CUDA framework. The model describes
a prototypical olfactory system of converging and diverging pathways that
performs unsupervised odor recognition (clustering) using a
spike-timing-dependent plasticity (STDP) learning rule.

Comparing the parallel implementation of the model to a well-designed
standard C/C++ implementation, I observed a 24x speedup when using an
NVidia Tesla C870 device for the CUDA implementation and a 3 GHz AMD
Phenom II X4 940 processor for the classical implementation. With this
speedup, the CUDA program runs the model, comprising 2,670 neurons and on
the order of 200,000 synapses, faster than real time.
 

-----------------------------------------------------------------

Constraints on representations from the statistics of our visual world

Roland Baddeley (Bristol University)

 

The light received by visual systems is the product of the surface reflectance of an object (which is constant across viewing position and time) and the illuminant (which can change drastically over time and viewing position). It would seem sensible for artificial and natural systems to represent objects and locations in terms of the invariant reflectance component of the visual signal rather than the highly variable raw signal.


Here we look at the statistics of surface reflectance, and at the temporal, spatial, and chromatic properties of real-world illuminants, to see what constraints these place on robust, illumination-invariant representations. We show that a large number of phenomena, ranging from why we have only three lightness terms in English to how temporal adaptation occurs, can be simply understood in terms of a system attempting to extract and represent reflectance from noisy, illumination-corrupted visual signals.
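
A minimal toy version of the reflectance-extraction problem, using a grey-world style estimator (our illustrative choice, not necessarily the mechanism proposed in the talk), might look as follows in Python. Because the illuminant multiplies every surface equally, dividing by the instantaneous spatial mean recovers reflectance up to a constant scale.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy illustration: a scene of fixed surface reflectances viewed under
# an illuminant that drifts over time.  The observed signal is their
# product; a grey-world estimate (spatial mean at each instant)
# recovers reflectance up to a constant scale, giving an
# illumination-invariant representation.
n_surfaces, T = 50, 200
reflectance = rng.uniform(0.05, 0.95, n_surfaces)              # constant in time
illuminant = np.exp(np.cumsum(0.05 * rng.standard_normal(T)))  # drifts in time

observed = reflectance[:, None] * illuminant[None, :]          # shape (50, 200)
observed *= np.exp(0.02 * rng.standard_normal(observed.shape)) # sensor noise

illum_est = observed.mean(axis=0)            # grey-world illuminant estimate
refl_est = observed / illum_est[None, :]     # invariant up to a scale factor

# Each surface's estimated reflectance is nearly constant over time,
# despite the drastically varying illuminant.
print(np.std(refl_est, axis=1).max())        # small temporal variation
```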
 

-------------------------------------------------------------------

Controlling Biomimetic Robots with Electronic Nervous Systems

Joseph Ayers (Northeastern University, US)
 

We build biomimetic robots based on simple neurobiological model animals: the lobster, the honeybee, and the sea lamprey. The robots feature a physical plant that captures the biomechanical advantages of the body form, a neuronal circuit-based controller, neuromorphic sensors, myomorphic actuators, and a behavioural set based on action patterns reverse-engineered from movies of the animal models. Our controllers are based on neuronal circuits established from neurophysiology. To achieve real-time operation, we base our electronic neurons on nonlinear dynamical models of neuronal behaviour rather than physiological models. We employ both UCSD electronic neurons and synapses (analog computers that solve the Hindmarsh-Rose equations) and discrete-time map-based neurons and synapses integrated on a DSP. Together these components provide an integrated architecture for the control of innate behavioural action patterns and reactive autonomy. We will illustrate this approach with a variety of platforms ranging from biohybrids to neuroprostheses and mariculture systems.
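
For reference, the Hindmarsh-Rose equations mentioned above can be integrated in a few lines. The sketch below uses standard textbook parameters in a bursting regime; these are assumptions for illustration, not the constants realised in the UCSD hardware.

```python
import numpy as np

# Minimal sketch: forward-Euler integration of the Hindmarsh-Rose
# neuron model.  x is the membrane-potential-like variable, y a fast
# recovery variable, z a slow adaptation current.
a, b, c, d = 1.0, 3.0, 1.0, 5.0    # fast-subsystem parameters (textbook)
r, s, x_rest = 0.006, 4.0, -1.6    # slow-subsystem parameters (textbook)
I_ext = 3.0                        # injected current: bursting regime

dt, T = 0.01, 200_000
x, y, z = -1.6, 0.0, 0.0
trace = np.empty(T)
for i in range(T):
    dx = y + b * x**2 - a * x**3 - z + I_ext
    dy = c - d * x**2 - y
    dz = r * (s * (x - x_rest) - z)
    x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    trace[i] = x   # exhibits the characteristic bursting pattern
```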

 

 

--------------------------------------------------------------------

Neural mechanisms of spatial cognition

Neil Burgess (UCL)

 

Computational models and single-unit recording data indicate that the neural basis of the sense of location involves a compromise between hippocampally mediated environmental information and entorhinally mediated short-term path integration. In particular, the relevant environmental information appears to be the set of distances to extended geometrical features (boundaries) along specific allocentric directions, while short-term path integration is supported by 'grid cells', whose firing may be generated by interference between oscillatory influences on firing in the theta band. Behavioural, neuropsychological, and fMRI evidence suggests that human memory for location is supported by similar representations in similar brain areas, in combination with egocentric representations elsewhere.
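
The oscillatory-interference idea can be sketched concretely: velocity-controlled oscillators with preferred directions 60 degrees apart accumulate phase in proportion to displacement along their preferred direction, and the product of the resulting interference envelopes is periodic on a triangular grid. The Python fragment below is a minimal, hypothetical rendering; the gain `beta` and the use of exactly three oscillators are illustrative assumptions.

```python
import numpy as np

# Minimal oscillatory-interference grid sketch: each velocity-controlled
# oscillator's phase lead over a baseline theta oscillation grows with
# distance travelled along its preferred direction, so the interference
# envelope depends only on position.
beta = 0.2                                    # spatial gain, cycles/m (assumed)
dirs = np.stack([(np.cos(a), np.sin(a))
                 for a in np.deg2rad([0, 60, 120])])  # preferred directions

xs = np.linspace(0, 10, 200)
X, Y = np.meshgrid(xs, xs)
pos = np.stack([X, Y], axis=-1)               # (200, 200, 2) grid of locations

g = np.ones_like(X)
for d in dirs:
    phase = 2 * np.pi * beta * (pos @ d)      # phase lead along direction d
    g *= 0.5 * (1 + np.cos(phase))            # interference envelope

# `g` now shows firing fields arranged on a triangular (hexagonal) grid.
print(g.shape, g.min(), g.max())
```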
 

 

 

----------------------------------------------------------------------

Mechanisms of insect behaviour

Barbara Webb (Edinburgh University)

 

Scientific explanation often takes the form of proposing mechanisms. Taken literally, this suggests that one way to evaluate hypotheses
is to build the mechanisms and see whether, and how, they really account for the phenomena. We have followed this strategy in investigating
a range of different insect behaviours, including auditory localisation, escape, visually guided walking, and olfactory and visual flight control,
and more recently navigation and learning. A tight interaction between experiments and modelling (if possible, carried out by the same person)
has been a particularly productive strategy, and I will illustrate this with some recent results.
 

 

------------------------------------------------------------------------

 

TBA

Paul Graham (Sussex)

 

-------------------------------------------------------------------------

Parsimonious route learning strategies in ants: A possible role for observed scanning behaviours

Bart Baddeley (Sussex)

 

Studies of visual navigation have revealed how insects combine simple strategies to produce robust behaviour, and insect navigation is now an established model system for investigating the sensory, cognitive, and behavioural strategies that enable small-brained animals to learn and utilise complex sequences of behaviour in the real world. We take a modelling approach to investigate the possible interactions between behaviour, learning, and the visual ecology of route-based navigation.


For an ant that can only translate in one direction relative to its body axis and has a fixed viewing direction, the direction of movement is determined by the viewing direction and vice versa. Thus, if the current retinotopic view is similar to a remembered view from a learned route, it is likely that the current viewing direction also represents the correct direction in which to move in order to follow that route. We propose that if ants are somehow able to recognise familiar views, then they can recapitulate routes simply by scanning the environment and moving in the direction that is deemed most similar to the views experienced during learning.
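
A minimal sketch of this scan-and-select loop might look like the following in Python. The view function `world_view_fn(position, heading)` is hypothetical, and the nearest-view familiarity measure is our simplification; the study itself trains a classifier to recognise views, as described below.

```python
import numpy as np

# Illustrative scan-and-select navigation: store views seen along a
# training route, then at each step rotate on the spot, score the view
# in every direction by similarity to the stored views, and move in
# the most familiar direction.
def familiarity(view, stored_views):
    """Negative squared distance to the closest remembered view."""
    return -min(np.sum((view - s) ** 2) for s in stored_views)

def choose_heading(world_view_fn, position, stored_views, n_directions=36):
    """Scan n_directions headings; return the most familiar one (radians).

    world_view_fn(position, heading) is a hypothetical stand-in for the
    agent's retinotopic view from a given pose.
    """
    headings = np.linspace(0, 2 * np.pi, n_directions, endpoint=False)
    scores = [familiarity(world_view_fn(position, h), stored_views)
              for h in headings]
    return headings[int(np.argmax(scores))]
```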


Support for such a strategy comes from behaviours observed in both desert ants and wood ants. When released in an unexpected but familiar place, the desert ant Melophorus bagoti scans the environment by turning rapidly on the spot. More than one scan may be performed, separated by short straight runs of a few centimetres, before the ant finally sets off in a seemingly purposeful manner. Wood ants exhibit a second form of scanning behaviour: instead of walking in a straight line, they tend to weave a somewhat sinuous path. This has the effect of producing scans of the world centred on the overall direction of movement.


We provide a proof of concept for this idea by training a classifier to recognise views and using it to learn a series of non-trivial routes through a real-world environment with a large gantry robot equipped with a panoramic camera. We also explore how route structure affects this process and discuss how it might relate to innate behaviours such as beacon aiming.
 

-------------------------------------------------------------------------

Prediction of Homing Pigeon Flight Paths using Gaussian Processes
Richard Mann (Uppsala University)

Gaussian processes are used as the mathematical framework of a probabilistic model for predicting the flight paths of individual homing
pigeons as they return home from familiar locations. Learning from past observations, the model makes accurate predictions of future flight
paths. These predictions are used to demonstrate route learning, discover landmark use, and understand the behaviour of pigeons released in
groups.
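
As a hint of how such a model works, the fragment below performs standard Gaussian process regression with a squared-exponential kernel on a one-dimensional stand-in for a path coordinate. The kernel choice and all parameters are generic assumptions, not those of the paper.

```python
import numpy as np

# Generic GP regression: predict one coordinate of a path as a function
# of distance along the route, from noisy past observations.
def rbf(a, b, length=1.0, var=1.0):
    """Squared-exponential covariance between 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

rng = np.random.default_rng(2)
x_train = np.linspace(0, 10, 30)                           # positions on route
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(30)  # observed coordinate
x_test = np.linspace(0, 10, 200)                           # query positions

noise = 0.1 ** 2
K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
K_s = rbf(x_train, x_test)

alpha = np.linalg.solve(K, y_train)
mean = K_s.T @ alpha                                       # predictive mean
cov = rbf(x_test, x_test) - K_s.T @ np.linalg.solve(K, K_s)
std = np.sqrt(np.clip(np.diag(cov), 0, None))              # predictive st. dev.

print(mean[:5])   # predicted coordinate at the first few query points
```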
 

 

---------------------------------------------------------------------------

 

Embodied motion intelligence: a dialogue between insects and robots

Volker Duerr (Bielefeld University)

 

A central property of nervous systems is that they are practically useless without the body in which they developed. If this claim is true, then a functional understanding of the nervous control of behaviour requires a functional understanding of the interactions between the central nervous system, its body, and the environment in which the animal is situated. In my talk I will pick examples concerning three issues in legged locomotion for which the importance of embodied information processing has been demonstrated in software or hardware models by Holk Cruse, Sven Hellbach, Josef Schmitz, Axel Schneider, and myself.


(1) The first issue is redundancy: the body has many more degrees of freedom than necessary for a given task. Coordination rules for inter-leg coordination are a prominent example where robots have been taught to exploit neurobiological findings about dealing with redundancy (a minimal sketch of one such rule follows this list). Today we know that some of these rules are subject to context-dependent adaptation, calling for a more flexible implementation than has been used so far.
(2) The second issue concerns limb biomechanics, particularly aspects that are due to the antagonistic organisation of the motor apparatus or to passive properties of body tissue. Here I will discuss how both of these aspects simplify the control of limb movements in insects, and present some bionic solutions developed at the University of Bielefeld.
(3) The third issue concerns the highly distributed, multi-modal sensory infrastructure of the insect body and its embodied nature. Here I will talk about dedicated sensory limbs that are involved in tactile near-range orientation, with emphasis on technologically relevant features. Finally, I will discuss how load sensors in a given leg can provide information about the present 'walking state' of neighbouring legs, and explain how simple, local sensory signals can aid the control of inter-leg coordination.
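
As promised after point (1), here is a caricature of one such local coordination rule in Python. The leg topology is that of a generic six-legged walker, but the rule itself, its threshold, and all names are invented for illustration and do not reproduce any specific published rule set.

```python
# Caricature of a Cruse-style local coordination rule: a leg may start
# its swing phase only when it has moved far enough backwards and none
# of its immediate neighbours is currently swinging.
LEGS = ["L1", "L2", "L3", "R1", "R2", "R3"]
NEIGHBOURS = {
    "L1": ["L2", "R1"], "L2": ["L1", "L3", "R2"], "L3": ["L2", "R3"],
    "R1": ["R2", "L1"], "R2": ["R1", "R3", "L2"], "R3": ["R2", "L3"],
}

def may_start_swing(leg, leg_position, swinging, threshold=1.0):
    """Local rule: swing only if far enough back and neighbours in stance."""
    posterior_enough = leg_position[leg] >= threshold
    neighbours_grounded = not any(swinging[n] for n in NEIGHBOURS[leg])
    return posterior_enough and neighbours_grounded

# Example: middle-left leg far back, all neighbours in stance -> may swing.
positions = {leg: 0.0 for leg in LEGS}
positions["L2"] = 1.2
swing = {leg: False for leg in LEGS}
print(may_start_swing("L2", positions, swing))   # True
```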
 


 

---------------------------------------------------------------------------

 

Modelling the modeller: towards a human-like robot with action-oriented imagination

Owen Holland (University of Sussex)
 

Many biological and artificial systems use models of one kind or another in tasks such as motor planning, motion control, and some forms of action selection. However, humans also appear to use models in the form of simulations in activities such as imagination and episodic memory; some animals may do this too, but as yet the evidence is far from convincing. In this talk I will describe an ongoing project to build a humanoid robot with a realistic human embodiment, and to equip it with the necessary mechanisms for deploying a potentially useful form of imagination. For such a robot, it is not enough merely to model its environment: the complexity of its body also necessitates implicit or explicit modelling of the body itself in order to predict the outcomes of interactions between body and environment. Of course, explicit self-modelling provides fertile ground for speculation in a number of areas; the talk will concentrate mainly on technical issues, but will also attempt to draw some parallels between the implementation of the artificial system and the possible or probable implementation of the biological system.
 

---------------------------------------------------------------------------