Centre for Computational Neuroscience and Robotics - CCNR

Research

Current research projects in the CCNR.

Green Brain project (EPSRC). In this project we are embarking on the ambitious goal of building a full-size model of the olfactory (Sussex) and visual (Sheffield) systems of the honey bee brain, implementing it on GPU supercomputers and using it to control an autonomous flying robot. The research is a close collaboration between Sussex and Sheffield, together with our experimental collaborator, Prof. Martin Giurfa in Toulouse. Dr Thomas Nowotny

INSIGHT: Darwinian Neurodynamics (FP7). This project explores the Neural Replicator Hypothesis (NRH), which postulates Darwinian neurodynamics within the brain itself. The project investigates the NRH in three ways: (1) development of formal models and computer simulations; (2) attempts to find empirical evidence for Darwinian neurodynamics at two different levels: adaptation at the neuronal level and adaptation at the level of human problem solving; (3) testing its ICT application potential in two critical domains: robotics and language communication. The CCNR is involved in (1) and (2) through computer modelling and neurophysiological investigations. Partners: Parmenides, Sussex, EPFL, Universitat Pompeu Fabra, QMUL. Prof Phil Husbands, Dr Andrew Philippides, Dr Kevin Staras

Criticality in the brain. The notion that the brain may operate in a critical regime is receiving much attention. The focus of our group is on (a) developing mathematical methods for robustly assessing the presence of criticality in electrophysiological data, particularly in the time domain; (b) characterising the presence of long-range temporal correlations in brain activity, particularly during development; (c) assessing the functional and developmental relevance of criticality. Dr Luc Berthouze
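One widely used way of quantifying long-range temporal correlations of this kind is detrended fluctuation analysis (DFA). The sketch below is a minimal, generic illustration of that idea, not the group's own method; the function name, window sizes and test signal are assumptions chosen purely for the example.

```python
import numpy as np

def dfa_exponent(signal, window_sizes):
    """Estimate the DFA scaling exponent of a 1-D signal.

    An exponent near 0.5 indicates uncorrelated noise; values between
    0.5 and 1 suggest long-range temporal correlations.
    """
    # Cumulative sum of the mean-subtracted signal (the "profile")
    profile = np.cumsum(signal - np.mean(signal))
    fluctuations = []
    for n in window_sizes:
        # Split the profile into non-overlapping windows of length n
        n_windows = len(profile) // n
        windows = profile[:n_windows * n].reshape(n_windows, n)
        x = np.arange(n)
        rms = []
        for w in windows:
            # Remove the linear trend within each window
            coeffs = np.polyfit(x, w, 1)
            trend = np.polyval(coeffs, x)
            rms.append(np.sqrt(np.mean((w - trend) ** 2)))
        fluctuations.append(np.mean(rms))
    # The DFA exponent is the slope of log F(n) against log n
    alpha, _ = np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)
    return alpha

# Example: white noise should give an exponent close to 0.5
rng = np.random.default_rng(0)
alpha = dfa_exponent(rng.standard_normal(10000), [16, 32, 64, 128, 256, 512])
print(f"DFA exponent: {alpha:.2f}")
```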

Insect-inspired algorithms for autonomous visual route navigation (EPSRC). Insects navigate in a procedural way; i.e. they know what to do, not necessarily where they are. The simplest procedural way to use visual information is to let a panoramic view define a movement direction. We are investigating algorithms that can guide routes by simply searching for views that are familiar. Dr Andrew Philippides
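As a rough sketch of the familiarity idea (not the project's actual algorithm), a heading can be chosen by rotating the current panoramic view and keeping the rotation that best matches any view remembered from a training route. All names and the toy panorama below are illustrative assumptions.

```python
import numpy as np

def best_heading(current_view, stored_views):
    """Pick a movement direction by view familiarity.

    current_view : 1-D array, a coarse panoramic intensity profile
                   (one value per azimuthal direction).
    stored_views : list of such profiles remembered from a training route.

    Rotating the current view corresponds to the agent turning on the
    spot; the rotation whose view is most familiar (closest to any
    stored view) defines the direction to move in.
    """
    n = len(current_view)
    best_shift, best_score = 0, np.inf
    for shift in range(n):  # one candidate heading per azimuthal pixel
        rotated = np.roll(current_view, shift)
        # Familiarity = smallest mismatch to any remembered view
        score = min(np.sum((rotated - v) ** 2) for v in stored_views)
        if score < best_score:
            best_shift, best_score = shift, score
    # Convert the pixel shift into a heading in degrees
    return best_shift * 360.0 / n

# Toy usage: the agent re-aligns itself with a remembered view
stored = [np.sin(np.linspace(0, 2 * np.pi, 72, endpoint=False))]
off_route_view = np.roll(stored[0], 10)   # same place, facing 50 degrees off
print(best_heading(off_route_view, stored))
# prints 310.0, i.e. turn by -50 degrees to restore the familiar view
```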

How do low-resolution eyes encode natural panoramic scenes? (BBSRC) Our aim is to examine, in the lab and in the field, the ways in which ants use, encode and recognise natural scenes. Despite its importance for navigation, little is known about how any animal encodes and identifies a natural scene. Insects, with their low-resolution eyes and small brains, are likely to have efficient ways of encoding scenes. Dr Paul Graham
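A crude way to mimic a low-resolution eye in simulation is to block-average a panoramic image down to a coarse grid and compare scenes on that grid. The sketch below is a hypothetical illustration of this, not the encoding model under study; the resolutions and the random test images are arbitrary assumptions.

```python
import numpy as np

def downsample_panorama(image, n_azimuth=72, n_elevation=10):
    """Reduce a panoramic image to an ant-like low resolution.

    image : 2-D array (elevation x azimuth) of intensities.
    Returns a coarse grid obtained by block-averaging, a crude stand-in
    for the coarse spatial sampling of a low-resolution compound eye.
    """
    h, w = image.shape
    # Trim so the image tiles exactly into blocks, then block-average
    bh, bw = h // n_elevation, w // n_azimuth
    trimmed = image[:bh * n_elevation, :bw * n_azimuth]
    blocks = trimmed.reshape(n_elevation, bh, n_azimuth, bw)
    return blocks.mean(axis=(1, 3))

def scene_difference(scene_a, scene_b):
    """Root-mean-square pixel difference between two encoded scenes."""
    return np.sqrt(np.mean((scene_a - scene_b) ** 2))

# Toy usage: two noisy views of the same place should look more alike
# after coarse encoding than views of different places
rng = np.random.default_rng(1)
place = rng.random((200, 720))
same = downsample_panorama(place + 0.1 * rng.standard_normal(place.shape))
other = downsample_panorama(rng.random((200, 720)))
print(scene_difference(downsample_panorama(place), same))   # small
print(scene_difference(downsample_panorama(place), other))  # larger
```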

Machine learning for artificial olfaction and chemical sensing (CSIRO). In this research project we are investigating how machine learning techniques can fruitfully be applied in the area of artificial olfaction, in particular in two applications: (a) the design of an optimal sensor array formed of different types of metal oxide sensors for a general chemical discrimination task; (b) the design of an optimal instrument based on biological sensors from the fruit fly Drosophila melanogaster for applications in threat detection and winemaking. Dr Thomas Nowotny
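One generic way to pose the sensor-array design question as a machine learning problem is to score candidate sensor subsets by cross-validated classification accuracy on recorded responses. The sketch below (using scikit-learn) is a hypothetical illustration under that assumption, not the project's actual pipeline; the synthetic data, sensor counts and function names are invented for the example.

```python
import itertools
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def best_sensor_subset(responses, labels, subset_size):
    """Pick the sensor subset that best discriminates the test chemicals.

    responses : array of shape (n_samples, n_sensors), one row per
                chemical presentation, one column per sensor type.
    labels    : chemical identity of each presentation.
    Exhaustively scores every subset of the given size by cross-validated
    classification accuracy (feasible only for small sensor pools).
    """
    n_sensors = responses.shape[1]
    best_subset, best_score = None, -np.inf
    for subset in itertools.combinations(range(n_sensors), subset_size):
        score = cross_val_score(SVC(), responses[:, subset], labels, cv=5).mean()
        if score > best_score:
            best_subset, best_score = subset, score
    return best_subset, best_score

# Toy usage: 8 hypothetical sensor types, 3 chemicals, 60 presentations
rng = np.random.default_rng(2)
labels = np.repeat([0, 1, 2], 20)
profiles = rng.random((3, 8))                      # each chemical's sensor profile
responses = profiles[labels] + 0.1 * rng.standard_normal((60, 8))
print(best_sensor_subset(responses, labels, subset_size=3))
```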