Sensation and Perception to Awareness: Leverhulme Doctoral Scholarship Programme


Applications for the 2019 Leverhulme Doctoral Scholarship Programme are now closed.

The competition will open again in October 2019 for projects starting in October 2020.
The projects listed below are for information only. New projects will be advertised once the competition re-opens.

Human-computer interaction and digital arts

Feeling human in the digital age

Supervisor: Prof Marianna Obrist

Apply to the School of Engineering and Informatics

Interactive technologies such as Virtual and Augmented Reality (VR/AR) are transforming the ways in which people experience, interact with and share information.  Advances in technology have made it possible to generate real and virtual environments with breathtaking graphics and high-fidelity audio.  However, without stimulating the other senses, such as touch and smell, such experiences feel hollow and fictitious; they lack realism.  This project will transform interactions as we know them through the integration of new sensory modalities into emerging technologies.  A student working on this project should have a background in computer science/HCI with a strong appreciation of psychology.  Programming skills (e.g. C++, C#, Unity) are required, and basic knowledge of experimental design and statistics is desirable.

Sensory augmentation in physical human-robot interaction

Supervisor: Dr Yanan Li

Apply to the School of Engineering and Informatics

Human-robot collaboration has seen increasing real-world application, e.g. in teleoperation and robot-assisted rehabilitation, by exploiting the complementary capabilities of human and robot.  For example, a robot can carry out a predefined task precisely and quickly while its human partner takes corrective actions when needed, thanks to the human's superior analytical and sensorimotor intelligence.  While most existing work focuses on how human and robot share the control effort, researchers in human motor control have recently found that humans can improve their sensorimotor performance by collaborating with other humans.  This interesting finding is interpreted as each human estimating the motion goal of the partner they are in physical contact with and using it to complement their own sensing when they track a common moving target.  This sensory augmentation is in line with the notion of observation-control duality in control theory but has not been studied for human-robot collaboration.  This project will therefore investigate whether this human-human interaction strategy can be used to improve the sensorimotor performance of human-robot interaction, and how a robot can use its human partner's sensing to improve its own through physical interaction.  An excellent applicant is expected to have: communication and writing skills; a background in robotics and control; analytical/mathematical skills; and programming skills.
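
The idea of complementing one's own sensing with an estimate of a partner's goal can be illustrated with a toy precision-weighted fusion calculation.  This is a hypothetical sketch of the general principle only, not the project's control scheme; the function name and all numbers are invented for illustration.

```python
# Toy sketch of sensory augmentation as minimum-variance fusion of two
# independent noisy estimates of a common target (illustrative only).

def fuse(own_estimate, own_var, partner_estimate, partner_var):
    """Combine own sensing with an inferred partner goal, weighting each
    estimate by its reliability (inverse variance)."""
    w = partner_var / (own_var + partner_var)      # weight on own estimate
    fused = w * own_estimate + (1 - w) * partner_estimate
    fused_var = (own_var * partner_var) / (own_var + partner_var)
    return fused, fused_var

# Own sensing is noisy (variance 4.0); the inferred partner goal is noisier
# (variance 6.0), yet the fused estimate is more reliable than either alone.
estimate, variance = fuse(1.0, 4.0, 1.5, 6.0)
```

The fused variance (2.4) is smaller than either input variance, which is the sense in which each agent's tracking can improve through contact with a partner.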

Multi-modal approaches to augmented reality in computational arts

Supervisors: Dr Chris Kiefer & Dr Cecile Chevalier

Apply to the School of Media, Film and Music

Augmented reality (AR) technology is developing at speed, and computational artists are beginning to experiment with new AR tools to create new aesthetic experiences (Chevalier and Kiefer, 2018).  Conversely, computational arts provide a testing ground for creating experimental scenarios that may provide new insights into the impact of AR technologies on human perception and behaviour.  Early conceptions of AR (e.g. Azuma, 1997) acknowledged the potential of rich multi-modal approaches (e.g. sound, smell, touch).  Current consumer AR technology tends towards predominantly visual augmentation and towards the provision of information, leaving a widely underexplored area focused on rich multisensory perceptual mediation.  We welcome proposals exploring augmented reality's new potential in the computational arts and developing new understanding of audience perception with AR tools.

Perception of naturalistic sounds and well-being - implications for composition of sound environments

Supervisors: Dr Alice Eldridge & Dr Hugo Critchley

Apply to the School of Media, Film and Music

Natural sounds have long been associated with the evocation of well-being.  Empirically, natural sounds are reported to be more 'pleasant' than artificial sounds (Guastavino, 2006), and are also reported to promote measurable health benefits, including reduced pain and anxiety in health care (Chiang, 2012).  Recent research helps explain these health benefits through observed alterations to autonomic activity when listening to naturalistic vs artificial sounds (Van Praag et al. 2017).  The implications for the composition of everyday as well as therapeutic sound environments are significant (online, games, virtual reality, as well as TV, film and music) but under-explored.  At the same time, digital music composition methods may provide further insights into open questions: it is not clear, for example, whether this effect is due to the formal perceptual properties (harmonic, timbral) of the sounds, or to wider associations of natural environments that are evoked.  Projects in this area would build upon the work of Van Praag et al. (2017) to gain deeper insight into the perceptual basis and compositional implications of this phenomenon.  Students should have a background in music as well as in cognitive neuroscience or psychology; experience of computational methods is highly desirable.

Student-led proposals

We welcome proposals for student-led projects in human-computer interaction (HCI), music and digital arts.  Projects on HCI may investigate how multisensory experiences make a difference to how we design and interact with technology in the future.  Music proposals can investigate aspects of musical performance, perception or composition that adopt strongly interdisciplinary perspectives or methods.  How might contemporary research on the senses inform multimedia art practices?  How might novel creative applications of new technologies shed light on the perception of music and/or computational arts?  Projects engaging with computational media and virtual and augmented reality technologies are encouraged.

Potential supervisors include: Alice Eldridge, Evelyn Ficarra, Ed Hughes, Chris Kiefer, Thor Magnusson, Marianna Obrist.

Human cognitive and computational neuroscience

Investigating the neural basis of predictive perception

Supervisor: Prof. Anil Seth

Apply to the School of Engineering and Informatics

An influential framework for investigating perception conceives of the brain as a 'prediction machine'.  In this view, perceptual content is determined by the brain's "best guess" of the causes of its sensory inputs.  This project will investigate the neural dynamics underlying (human) conscious perception through combinations of psychophysics, brain stimulation, and brain imaging.  Specifically, we will develop paradigms using combined transcranial magnetic stimulation and electroencephalography (TMS/EEG), together with psychophysics and time-series analysis, to open new windows onto the brain basis of conscious perception.  Students should have a background in cognitive neuroscience, psychology, engineering, or physics, with experience of computational methods highly desirable.  As well as being part of the Leverhulme doctoral programme, the student will join a highly interdisciplinary research team based at the Sackler Centre for Consciousness Science.
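
The 'prediction machine' view can be illustrated with a minimal toy model.  The sketch below is purely illustrative and not a paradigm the project will use; it refines a single perceptual estimate by gradient descent on precision-weighted prediction errors, and all names and parameter values are invented.

```python
# Minimal predictive-coding sketch (illustrative only): a latent estimate mu
# is the brain's "best guess", refined by balancing a prior belief against
# a sensory observation, each weighted by its precision (inverse variance).

def perceive(sensory_input, prior_mean, prior_precision=1.0,
             sensory_precision=4.0, lr=0.05, steps=200):
    """Return the estimate mu that minimises precision-weighted prediction error."""
    mu = prior_mean
    for _ in range(steps):
        eps_prior = mu - prior_mean          # error with respect to the prior
        eps_sense = sensory_input - mu       # error with respect to the input
        # Gradient step for a simple Gaussian generative model:
        mu += lr * (sensory_precision * eps_sense - prior_precision * eps_prior)
    return mu

# With the sensory signal at 2.0 and the prior at 0.0, the estimate settles
# between the two, pulled towards the more precise (sensory) source.
mu = perceive(sensory_input=2.0, prior_mean=0.0)
```

At convergence the estimate is the precision-weighted average of prior and input, here (4.0 × 2.0 + 1.0 × 0.0) / 5.0 = 1.6.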

The role of predictability in conscious awareness and memory

Supervisors: Dr Ryan Scott & Dr Chris Bird

Apply to the School of Psychology

In our lives we encounter situations that are familiar and predictable and others that are novel and unexpected.  Our familiarity with a situation has dramatic effects on ongoing mental processes; we tend to be more alert and aware of our surroundings in an unfamiliar situation and more prone to introspection in a familiar one (Frankenhuis et al. 2016).  But how general is this effect and what are its mechanisms?  Are we better at detecting meaningful signals in an unfamiliar environment or just distracted by irrelevant details?  Are we better equipped to draw on unconscious sources and to learn new information in unfamiliar situations?  And are there neural correlates of the predictability of a situation?  To address these questions in naturalistic settings we will use 3-D videos and immersive virtual reality to situate people within predictable or unpredictable situations while they perform a variety of cognitive tasks.  Analyses will seek to combine behavioural responses with neural processing data collected using methods such as MEG.  The project will involve both behavioural experiments and computer-based analyses.  Students should have a background in cognitive neuroscience, psychology or informatics.

Free energy agents: understanding the information theoretic foundations of adaptive behaviour

Supervisor: Dr Christopher Buckley

Apply to the School of Engineering and Informatics

Animals exhibit a remarkable ability to problem-solve quickly and efficiently when faced with new challenges in their environment.  Humans are exceptional problem solvers, but the ability to innovate and successfully manipulate the environment extends across the animal kingdom.  Central to this ability are behaviours that combine actions in pursuit of a given goal (goal-driven or instrumental behaviour) with actions aimed at understanding the surrounding environment (curiosity-driven or epistemic behaviour).  Theories inspired by the Bayesian brain hypothesis have become a natural way to describe this type of behaviour (Oaksford & Chater 2007).  In particular, converging theory in machine learning (Houthooft et al. 2016; Schmidhuber 1991; Elfwing et al. 2016) and the brain sciences (Friston et al. 2017) has suggested that the balance between instrumental and epistemic behaviour is well described by approximate Bayesian inference schemes that rest on the minimisation of an information-theoretic quantity called variational free energy (Buckley et al. 2017; Friston 2010).  In this project we will explore and develop the ideas inherent in this 'Free Energy Principle' through theoretical and agent-based approaches, to develop new architectures for autonomous robots and to account for data in human psychophysical experiments.  A good applicant will have a background in a quantitative science, experience of programming, and the desire to get involved in both cutting-edge machine learning and neuroscience research.
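
The quantity at the heart of this framework can be made concrete with a small worked example.  The toy computation of variational free energy below, for a two-state discrete generative model, illustrates the general definition only, not the architectures proposed above; all probabilities are invented.

```python
import math

# Variational free energy for a discrete generative model (toy example):
#   F = sum_s q(s) * [ln q(s) - ln p(o, s)]
# Minimising F over the beliefs q drives q towards the exact posterior
# p(s | o), at which point F equals the surprise -ln p(o).

def free_energy(q, prior, likelihood, obs):
    """q: beliefs over hidden states; likelihood[s][o] = p(o | s)."""
    F = 0.0
    for s, qs in enumerate(q):
        if qs > 0:
            joint = prior[s] * likelihood[s][obs]   # p(o, s)
            F += qs * (math.log(qs) - math.log(joint))
    return F

prior = [0.5, 0.5]
likelihood = [[0.9, 0.1],   # p(o | s=0)
              [0.2, 0.8]]   # p(o | s=1)
obs = 0

# The exact posterior minimises F:
posterior = [prior[s] * likelihood[s][obs] for s in (0, 1)]
Z = sum(posterior)                       # p(o) = 0.55
posterior = [p / Z for p in posterior]

F_post = free_energy(posterior, prior, likelihood, obs)   # = -ln p(o)
F_flat = free_energy([0.5, 0.5], prior, likelihood, obs)  # larger
```

Any beliefs other than the posterior (here, the flat distribution) yield a strictly larger free energy, which is why minimising F implements approximate Bayesian inference.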

Student-led proposals

We welcome proposals for student-led projects using methods of cognitive and computational neuroscience to investigate issues in sensation, perception, and awareness.  These methods may include, but are not limited to, neuroimaging, psychophysics, behavioural studies, virtual and augmented reality, and psychophysiology.  Which computational aspects of active predictive perception shape visual awareness?  What are the neural signatures of unusual perceptual experiences?  How are internal and external sensory signals integrated to create a sense of self?  Project proposals should identify an appropriate supervisor and clearly explain the research question and proposed methodology, as well as relevance to the Leverhulme programme agenda.

Potential supervisors include: Jenny Bosten, Hugo Critchley, Zoltan Dienes, Sophie Forster, Anna Franklin, Sarah Garfinkel, Warrick Roseboom, Ryan Scott, Anil Seth, Natasha Sigala, Julia Simner, Jamie Ward.

Sensory Neuroscience

Characterizing sensory signalling at nanoscale and millisecond-order resolution

Supervisor: Prof. Kevin Staras

Apply to Sussex Neuroscience

Neurons communicate by the release of neurotransmitter-containing vesicles at the synapse (Staras, Neuron 2012). Everything that we see or hear depends on the operation of synapses that are specialized to transmit sensory information.  Ribbon synapses are so-called because of a unique 'ribbon' structure that holds vesicles close to the release site and underlies their ability to sustain continuous transmission.  We are interested in how this structure contributes to the first computations in the visual pathway.  For example, the ribbon might act as a vesicle conveyor belt, or perhaps as a site to 'pre-fuse' vesicles for improved information encoding.  This project will test these fundamental hypotheses in transgenic zebrafish/mice using groundbreaking new technology (Leica-EM-ICE) that combines light-driven synaptic activation with synchronized ultrafast-freeze fixation for electron microscopy analysis (Watanabe, Nature, 2014). In this way, instantaneous snapshots of vesicle fusion are captured with ultrastructural resolution, allowing the millisecond-order events occurring during transmission to be directly visualized.  This project, which will suit a student with good technical skills and a strong interest in neuroscience and computation, is an exciting opportunity to exploit state-of-the-art methods to reveal fundamental events in sensory signalling.

How do nitric oxide synthase interneurons modulate information processing and control blood flow?

Supervisors: Dr Catherine Hall & Prof Miguel Maravall

Apply to the School of Psychology

Sensory processing is regulated by diverse populations of interneurons that modulate excitatory neuronal activity to shape information transmission.  Certain interneuron populations may also be key mediators of the vascular response to increased neuronal activity, which matches blood flow to increases in neuronal energy requirements.  Of the different types of interneuron, the most enigmatic is, perhaps, the population that expresses neuronal nitric oxide synthase (nNOS).  These cells may increase blood flow but have an unknown role in information processing, not only in sensory cortex but also in upstream regions such as the hippocampus.  To uncover their role in information processing, and to understand how they may help the brain to balance energy supply and demand, this project will investigate the conditions that lead to activation of nNOS neurons in sensory cortex and the hippocampus, and whether such conditions also increase local blood flow.  The project will involve imaging the activity of nNOS interneurons, excitatory neurons and blood vessels in awake behaving mice navigating a virtual reality environment, while altering the sensory, spatial or contextual (novelty, reward state) information presented to the mice.  Applicants should have a background in neuroscience or a related discipline and some experience of programming for data analysis.

How sequence recognition emerges from neuronal activity

Supervisor: Prof Miguel Maravall

Apply to the School of Life Sciences

The world around us is replete with stimuli that unfold over time and whose temporal structure is essential to their meaning.  A patterned stream of sounds such as music or speech, or a surface scanned by our fingertips, can only be identified by sensing the order and timing of the constituent notes, phonemes or ridges. Making sense of the world requires that we recognise and track temporal patterns in this sensory input.

Consistent with the importance of temporal patterning, humans and other animals have remarkable capacities for discriminating temporally structured sensory sequences.  Yet how the brain represents and learns sequential structure remains mysterious.  Which features of neuronal activity signal the presence of a learnt sequence?  Do single neurons respond selectively to sequence identity?  This project will address these gaps by training humans and mice to recognise specific sequences (Bale et al., eLife, 2017) and recording neuronal activity in the cerebral cortex.  A student working on this project should have a background in neurobiology, cognitive neuroscience or a quantitative subject (e.g., physics, computation) and be interested in combining experiments with coding and computational analysis (using Matlab or Python).

Plasticity of visual processing in primary visual cortex

Supervisor: Prof Leon Lagnado

Apply to the School of Life Sciences

The responses of pyramidal neurons in primary visual cortex (V1) of mice are altered under different behavioural states.  For instance, multiphoton imaging of neural calcium reporters (GCaMPs) has shown that a visual stimulus activates pyramidal neurons more strongly when the mouse is locomoting or aroused compared to when it is sitting still.  This modification of the flow of excitatory signals within V1 is caused by interactions between pyramidal neurons and a diverse population of inhibitory (GABAergic) interneurons.

We will investigate how interneurons modulate responses in primary visual cortex (V1) of mice under different behavioural states and during the learning of a visual task.  Learning of a visual task alters the relation between the activity of two particular subtypes of interneuron - the VIPs and the SSTs - that are reciprocally connected.  In particular, we will explore the possibility that in the VIP-SST circuit, one population of interneurons can dominate over the other depending on the stimulus presented, and that the strength of these reciprocal synaptic connections is modulated during learning.  This project will involve multiphoton imaging of neural and synaptic activity in the visual cortex of awake mice as they learn a visual task.

Information encoding at the first synapse in the visual pathway

Supervisor: Prof. Tom Baden

Apply to the School of Life Sciences

Visual processing in the retina depends on the transformation of signals as they are transmitted across synapses (James et al. 2018 bioRxiv; Franke et al., 2017, Nature).  The first synapses in the visual pathway are structurally distinct and employ a "vesicle code" that represents information by both the rate and, surprisingly, the amplitude of events releasing neurotransmitter.  We will investigate this code at photoreceptor synapses using multiphoton imaging in the retina of transgenic zebrafish.  How does the vesicle code transmit information from cones of different colour? How is it adapted to cones sampling different parts of visual space (Zimmermann et al. 2018 Current Biology)?  How does it compare to synapses transmitting the visual signal in the inner retina? The project involves both experiments and computer-based analysis.  Students should have a background in neuroscience, engineering, physics or computing.

Elemental perceptual processes in insect vision for navigation

Supervisors: Dr Jeremy Niven & Prof. Paul Graham

Apply to the School of Life Sciences

Visual navigation, by which animals find places (e.g. home) defined by memories of a visual scene, has basic similarities across a wide range of species from ants to humans.  The behaviour depends on both identifying objects within scenes and encoding their relationships to goals and possibly to each other.  We can investigate these perceptual and learning processes in navigating ants (Lent et al., 2013, Curr Biol; Buehlmann et al., 2016, Curr Biol), where modern trackball systems allow for VR presentation of visual stimuli to freely behaving animals.  These new techniques will enable us to approach many new questions.  Two examples are whether ants memorise the three-dimensional arrangement of objects, and whether attentional mechanisms allow ants to prioritise areas of the visual scene.  These kinds of experiments will shed light on whether diverse animals, such as humans and ants, employ similar perceptual mechanisms to control shared behaviours.

We would like to recruit a technically inclined student with a background in neuroscience or behaviour to investigate these questions.

The role of non-synaptic interactions in odour object recognition

Supervisors: Prof. Thomas Nowotny and Prof. George Kemenes

Apply to the School of Engineering and Informatics

Odorants are chemicals in the air that animals can detect with their olfactory organs to gather information about their environment.  Animals use olfactory information to find food, sexual partners, a place to deposit eggs, or to avoid predators.  In insects, odours are sensed by olfactory receptor neurons on the antennae or on the maxillary palps.  The vast majority of scientific studies, both experimental and theoretical, assumes that each olfactory receptor neuron acts on its own and that interactions between neurons only occur through synaptic connections deeper in the brain.  However, in many insects, olfactory receptor neurons of different types are co-housed in close proximity in olfactory sensilla and interact.

In this PhD you will build on ongoing efforts in our lab to investigate the nature of these interactions and their role in recognizing odour objects, i.e. the behaviourally relevant sources of smells.  The work will be a combination of electrophysiology, behavioural work and computational modelling (see e.g. Chan et al., bioRxiv, 2017), and the particular emphasis will depend on your preferences.

Olfaction, the sense of smell, is one of the most poorly understood senses, but exciting progress is currently being made in this research area, in particular in insects.

Putting the spotlight on taste: behavioural and optical interrogation of the neural mechanisms of hedonic taste perception

Supervisors: Dr Eisuke Koya & Dr Hans Crombag

Apply to the School of Psychology

Our sense of taste and ability to derive pleasure from foods are essential for survival.  Taste also guides our food preferences and aversions, reinforces appetitive learning, and adds pleasure to a mundane but essential activity for maintaining health.  On the flip side, food-evoked pleasures may drive us to eat delicious but unhealthy foods (e.g. snack foods) to excess, while avoiding bland-tasting but healthy foods (e.g. vegetables).  Hence, it is important to understand the mechanisms that drive food preferences and taste perceptions, and how these determine eating patterns.  This PhD studentship will allow you to use state-of-the-art methodologies to probe the neural circuitry involved in taste hedonics in mice, focusing on interactions between the hypothalamus, nucleus accumbens and other forebrain regions (Ziminski et al. 2017; PMID: 2821344); these include fibre-photometric measurement of calcium signals and optogenetics to manipulate discrete neural pathways (Emiliani et al. 2015; PMID: 26468193).  Detailed analysis of consumption patterns provides a window into the perception of hedonic rewards, allowing us to examine how physiological states (e.g. hunger), learnt changes in value, and/or levels of motivation (e.g. effort required to obtain food) might affect them.  Candidates should have a background in behavioural neuroscience (or a related field) and be motivated to work independently.

Student-led proposals

We welcome student-led project proposals that examine basic mechanisms of sensory coding, through to the study of neural circuits and their computational principles.  New techniques such as optical (multiphoton) microscopy can be used to study complex neural circuits in freely moving animals in naturalistic environments (real or virtual), in species such as mice and zebrafish.  These methods allow us to characterise, analyse and model the distributed neural circuits that allow behaving animals to process the sensory signals caused by both the environment and their own actions.  Student-led proposals could also explore cross-overs from biology to robotics and AI, where machines learn to actively deploy the sensory modalities of vision, taste, smell, touch and sound.

Potential supervisors include: Tom Baden, Chris Buckley, Hans Crombag, Paul Graham, Catherine Hall, Eisuke Koya, Leon Lagnado, Miguel Maravall, Thomas Nowotny, Jeremy Niven, Andy Philippides.