Projects, innovation and impact

Find out more about our projects, how they're driving innovation and how they're funded.

Research projects

  • Watching the mind travel: understanding internal distraction

    This ESRC-funded project, led by Dr Sophie Forster, explored how our own spontaneous thoughts and mental images can distract us from the world around us. By developing new ways to study involuntary thoughts and their impact on attention and brain activity, the research revealed how internal distractions shape perception and focus and provided insights relevant to education, work, and mental health.

  • Identifying early markers of autism in naturalistic motor behaviour using high-frequency sampling

    Led by Professor Gillian Forrester and funded by the Simons Foundation, this project tracked infants from birth to around 18 months using wearable sensors and smart toys in the home. It captured fine-grained, weekly motor behaviour data and biweekly social-communication assessments. The goal is to map early motor development, its interaction with emerging social-communication skills, and whether distinct patterns in infants at low and high risk for Autism Spectrum Disorder (ASD) can serve as early markers for targeted intervention.

  • The mind’s eye: decoding colour experience (Project COLOURCODE)

    Led by Dr Jenny Bosten and funded by a European Research Council Starting Grant under the EU’s Horizon 2020 programme, project COLOURCODE examines how individuals perceive colour differently and explores the neural and computational mechanisms of colour experience. Through a blend of psychophysical experiments, neuroimaging and modelling, the research investigates how our brains encode and interpret colour information. This has implications for understanding human perception and potential applications in vision science and technology.

  • Calibration of colour perception to the environment (Project COLOURMIND)

    Funded by a €2 million European Research Council Consolidator Grant to Professor Anna Franklin with Dr Jenny Bosten, project COLOURMIND explored how our visual environments shape the way we see colour. The project combined fieldwork in Ecuadorian rainforests and Arctic landscapes with lab-based psychophysics, hyperspectral imaging, fMRI, virtual reality, and infant studies. The team showed that colour perception is finely tuned to environmental colour statistics. Their findings reveal that our sense of colour adapts across timescales, from early infancy to changes in latitude and season, demonstrating that colour vision is deeply calibrated to the world we experience.

  • Predicting the future to make sense of the present: predictive brain mechanisms for speech perception

    Funded by the BIAL Foundation and led by Dr Ediz Sohoglu, this research explored how the brain uses predictions to understand spoken language. It investigated the neural and cognitive mechanisms by which our brains anticipate upcoming speech sounds, enabling us to make sense of degraded or noisy speech signals. By understanding how the brain forecasts and fills in missing auditory information, the project shed light on fundamental processes of perception and may inform interventions for individuals with speech-perception difficulties.

  • Misophonia in children: impact, co-morbidities, and roadmaps to treatment

    Funded by the REAM Foundation’s Misophonia Research Fund, this longitudinal project led by Professor Julia Simner, Dr Louisa Rinaldi and Professor Jamie Ward developed a screening tool for children and adolescents. It investigated how misophonia affects schooling, cognition, personality and well-being, situating it within broader sensory sensitivities.

  • Using thermal cameras to read stress in adults, children and primates

    Led by Professor Gilly Forrester, this project is developing a new, non-contact way to measure physiological stress in both humans and non-human primates using thermal imaging. Thermal cameras can detect stress-related temperature changes in the face, and the next phase will pilot the newly validated method with infants, exploring how those with high or low likelihood of neurodevelopmental conditions respond to light, sound and touch. This research was published in PLOS ONE.

Innovation and impact

At the Sussex Centre for Sensory and Perceptual Diversity, our research doesn’t stop at discovery: it drives innovation that changes lives.

We work with partners across health, education, technology, design and the arts to translate insights about human perception and sensory experience into practical tools, policies and creative solutions. From digital technologies that support sensory accessibility, to collaborations with industry and cultural institutions that reshape how the world is designed and experienced, our work bridges science and society to promote inclusion, wellbeing and a deeper understanding of sensory diversity.

  • Smart technologies for eating behaviour and wellbeing

    Professor Martin Yeomans is leading a research collaboration developing innovative tools to understand and support healthy eating. A UKRI Consumer Lab Innovation Grant (2024–2025) and an Innovate UK Accelerated Knowledge Transfer Partnership with Emteq Labs have funded research on wearable “smart-glasses” technology to monitor real-world eating behaviour and emotional responses.

    By combining psychophysiological sensing with behavioural science, the project aims to create new interventions for weight management, improve wellbeing, and bridge academic insight with real-world impact in digital health and nutrition.

  • CrossSense: Smart-glasses to support independence for people living with dementia

    CrossSense, developed by the UK-based cooperative Animorph with Science Lead Professor Julia Simner, is an AI app for smart glasses designed for people living with dementia, helping them navigate their environment, maintain recognition, and support independence.

    The product combines lightweight AR smart glasses with an AI-powered app to deliver spoken and written reminders about objects, people, and everyday tasks. It has been shortlisted for the prestigious Longitude Prize on Dementia and is one of five projects awarded £300,000, selected from 175 international entries across 28 countries.

  • The Synesthesia Toolkit

    Developed by Professor Julia Simner and the Multisense Synaesthesia Research Lab, this ERC-funded project created the world’s first online Synesthesia Toolkit, a freely available resource for parents, teachers, and professionals supporting children with synesthetic experiences.

    The toolkit includes screening tools, educational guidance, and downloadable materials to help recognise and understand synesthesia in everyday life. Its launch attracted thousands of visitors and widespread interest from schools and support services, marking a major step in making sensory diversity more visible and better understood.

  • Sensory and perceptual software development

    Dr James Alvarez and Max Lovell create bespoke digital tools that translate research on sensation and perception into real-world applications. Through collaborations between researchers and software developers, the team builds technologies that advance understanding of sensory diversity, support accessibility, and turn experimental insights into practical benefits for education, health, and everyday life.

  • SoundSight: a mobile sensory substitution app

    SoundSight is a smartphone assistive tool designed to convert visual and thermal information, such as depth, colour, and heat, into auditory soundscapes in real-time. Users can customise the sensor input and choose how the resulting sounds are presented (for example by timbre, pitch, or spatialisation) to suit their preferences and needs. The system is designed to be low-cost, scalable, and flexible, making it a promising technology for visually-impaired users, researchers investigating sensory substitution, and developers of auditory-based interfaces.

  • ColourSpot: gamified colour vision screening

    Developed by Professor Anna Franklin and Dr Jenny Bosten, ColourSpot is a colour-calibrated, psychophysical tablet-based game that diagnoses colour-vision deficiency (CVD) in children as young as four. The development of ColourSpot was funded by a European Research Council Proof of Concept grant and a Business Development Fellowship. Rigorous validation shows that ColourSpot achieves high diagnostic accuracy and could be scaled for use by parents, schools or opticians.

  • Infant vision, arts engagement and design

    Caregivers, early years professionals and designers need to understand how infants see real-world stimuli in order to cater to their needs. Professor Anna Franklin and Dr Alice Skelton of the Sussex Baby Lab have identified how young infants see colour and other visual properties of real-world stimuli such as art, natural scenes, architecture, museum exhibits and books. Sussex Baby Lab findings have been drawn on by globally renowned designers, commercial partners, national institutions and early years charities, leading to the design of award-winning infant products, new early years initiatives, and improved access for babies to the arts and their cultural heritage.

  • Inclusive attentional engagement

    Attention has been described as ‘the gateway to information processing’, so it is unsurprising that differences in attention are known to substantially impact educational outcomes. Dr Sophie Forster delivered workshops to teaching staff and a local adult education college, focusing on the importance of attentional engagement for the inclusion of neurodivergent students and on how to fine-tune teaching materials to effectively engage attention.
