Creative Technology

Welcome to the Creative Technology research group

Our Creative Technology research is concerned with the interfaces between humans and digital technology and how these are changing. We investigate interaction in the broadest sense, considering it in relation to digital technologies, connected physical artefacts, and people’s experiences and practices with mobile, immersive, ubiquitous and pervasive computing.

Highlights of the Creative Technology Research Group Launch, on 26th April

Kate Howland: Building Blocks These children’s construction blocks let young people explore ideas through active building and experimenting. My research centres on tools and methods that provide appropriate structure and support for users to engage in the design and creation of digital artefacts. Like these blocks, they must allow users to work at the appropriate granularity, and provide affordances and constraints that match the users’ expertise and aims.

Peter Cheng, Frances Aldrich and Grecia Garcia: Knowledge Visualisation A mini-exhibition of novel graphical representations for knowledge-rich education topics and information-intensive problem-solving tasks. These include: electricity, particle collisions, school algebra, syllogistic logic, propositional logic, sets, probability, the London Underground transit system, production planning and scheduling, timetabling, and personnel rostering. Interactive-digital and physical-manipulative examples will be available to try out.

Peter Cheng and Ron Grau: Tactile Graphics, reading diagrams by touch This display shows novel designs of tactile tables and graphs for people with visual impairment. See how we use software to track and analyse individual finger movements in order to find the patterns that help us understand the reading strategies of novice and proficient tactile readers. Visitors can try it out and explore their own reading patterns, using one of our tablet computers.
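To give a flavour of the analysis involved, the sketch below is a minimal, hypothetical example (Python; the sample data and measures are illustrative, not the group's actual tracking software) of logging timestamped touch points per finger and deriving simple features such as path length and mean speed, the kind of measures one might compare between novice and proficient tactile readers.

```python
from collections import defaultdict
from math import hypot

# Hypothetical touch samples: (finger_id, x_mm, y_mm, t_seconds).
samples = [
    (0, 10.0, 20.0, 0.00), (0, 12.0, 21.0, 0.05), (0, 15.0, 23.0, 0.10),
    (1, 40.0, 40.0, 0.00), (1, 40.5, 41.0, 0.05), (1, 41.0, 43.0, 0.10),
]

def movement_features(samples):
    """Group samples by finger and compute path length and mean speed."""
    by_finger = defaultdict(list)
    for finger, x, y, t in samples:
        by_finger[finger].append((t, x, y))
    features = {}
    for finger, points in by_finger.items():
        points.sort()  # order each finger's samples by time
        path = sum(hypot(x1 - x0, y1 - y0)
                   for (_, x0, y0), (_, x1, y1) in zip(points, points[1:]))
        duration = points[-1][0] - points[0][0]
        features[finger] = {
            "path_length_mm": round(path, 2),
            "mean_speed_mm_s": round(path / duration, 2) if duration > 0 else 0.0,
        }
    return features

print(movement_features(samples))
```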

Paul Newbury and Marco Gilardi: Technology Enhanced Learning, animating characters and deforming terrain. Through engaging representations of work in the field, this demo will give an overview of ongoing research in collaborative video-based learning, animating avatars in virtual 3D environments, and real-time deformation of 3D virtual terrains, such as a soil deformation simulation.
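As an aside for readers curious about how real-time terrain deformation is commonly represented, here is a minimal sketch (Python with NumPy; the heightmap representation and radial falloff are illustrative assumptions, not the project's implementation) that presses a smooth dent into a heightfield, the basic operation behind effects such as footprints or soil displacement.

```python
import numpy as np

def deform_heightmap(heights, cx, cy, radius, depth):
    """Press a radial dent of the given depth into a 2D heightmap.

    heights : 2D array of terrain heights
    cx, cy  : centre of the deformation in grid coordinates
    radius  : radius of influence in grid cells
    depth   : maximum displacement at the centre
    """
    ys, xs = np.indices(heights.shape)
    dist = np.sqrt((xs - cx) ** 2 + (ys - cy) ** 2)
    # Smooth falloff: full depth at the centre, tapering to zero at the radius.
    falloff = np.clip(1.0 - dist / radius, 0.0, 1.0)
    return heights - depth * falloff ** 2

terrain = np.zeros((64, 64))                        # flat terrain
terrain = deform_heightmap(terrain, 32, 32, radius=10, depth=0.5)
print(terrain[32, 32], terrain[32, 45])             # dented centre vs. untouched cell
```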

Judith Good: ASCmeI.T. Although technologically very simple, ASCmeI.T. allows individuals with autism, and their parents and carers, to tell researchers what sorts of technologies they would like to see designed, thus extending the reach of participatory design to the ideation phase and giving individuals with disabilities a stronger voice in the design process.

Olivia Thorne and Judith Good: Emokno Emokno is an interactive toy robot with natural language capabilities: it is designed to engage in free-form conversations with children with autism, sing songs, tell jokes, etc. It can also express different emotions in response to children’s utterances. The aim is that children will better understand the causes of emotions through interacting with Emokno.

Marianna Obrist with Damien Ablart, Dmitrijs Dmitrenko, Elia Gatti, Emanuela Maggioni and Dario Pittera: SCHI Lab

  • Demo 1: Multisensory art experience: Vision and hearing have dominated HCI for decades. Now HCI researchers are harnessing touch and have started to explore taste and smell for designing interactive experiences. In the Tate Sensorium project, a multisensory display at Tate Britain, we explored the effect of multisensory stimuli on visitors’ experience of art. In this demo, we present the painting “Full Stop” by John Latham, enhanced by a sound-haptic experience.
  • Demo 2: 9D TV: Programme makers and technology manufacturers know how to design their products so you can see depth and distance on the screen, but sound and vision aren’t always enough. The question for the TV industry is which multisensory experiences it should design for, and how. Being able to smell the odours a character on screen would smell, or feel the same objects, can create anticipation and build suspense in the same way that sound currently does.
  • Demo 3: In-Car Olfactory Interaction: Smells can convey driving-related information and elicit emotions that enhance the driving experience. In a driving simulator, certain pieces of information are conveyed to the driver by scents, using a commercially available scent-delivery device (Aroma Shooter). The user can sit on a simple office chair.
  • Demo 4: Rubber hand illusion: The rubber hand illusion (RHI) is used in psychology: it consists of making a person embody a fake hand, demonstrating the plasticity of our body schema. In Virtual Reality, there is a need to make interaction with the environment more realistic. The virtual hand illusion (VHI), which uses a virtual hand instead of a rubber one, employs mid-air tactile feedback to set up the illusion and provide useful information for programmers and game designers who want to program complex stimuli such as rain, fog, etc.
  • Demo 5: Scented materials modelling: Everyday life is filled with sensory stimuli conveying information to our brains. From the first days of our lives, we learn to associate environmental stimuli with feelings, ideas, and actions. This research investigates how the outside world (i.e. sensory events) can influence the expression of our “inner world” (i.e. creativity). Smell, haptic, acoustic and visual feedback are used to influence the creativity of participants asked to create an abstract piece of art, allowing us to better understand human creativity.

Martin White, Ben Jackson, Sasithorn Rattanarungrot: Virtual museums, engaging with our cultural heritage through digital media: a service-oriented approach to digital heritage installations The Reanimating Cultural Heritage repository and the Santa Chiara 3D Reconstruction show how the public can experience digital cultural objects within their historical context, thereby enhancing awareness and understanding of our cultural heritage. Both are being updated to reflect what current technology allows.

Steve Huckle, Rituparna Bhattacharya, Natasha Beloff and Martin White: The Internet of Everything, Wearables, and Shared Economies This is an investigation of how technologies such as IoT, wearables and blockchains can be exploited to build shared-economy applications: an enhanced shopping experience with the Internet of Place, urban ecology observations with enhanced location-based services, Green Points loyalty rewards for sustainable urban living, monetised playback of digital media, and smart-contract payments for on-campus energy use.

Phil Watten and Patrick Holroyd: A Mobile Acquisition Tool for Professional Filmmakers. The latest iPhones are capable devices featuring high-quality cameras and powerful GPUs. They are good devices for consumer-level photography and, combined with the usability and portability they offer, iPhones are on the verge of being desirable to professional filmmakers. However, the consumer-level functionality is a big barrier to adoption. This new tool uses the number-crunching capabilities of the GPU combined with a user interface specifically designed for professional operators to offer the level of control filmmakers demand.

Ann Light: Design for Sharing, building infrastructure for social, environmental and economic sustainability How might resources share themselves? How might broader sharing be encouraged? A Digital Economy study of Design for Sharing identified three distinct but interrelated aspects of sustainability that sharing finite resources, such as tools, produce, time and skills, can support. The report has fed into new project ideas: World Machines, which connect up world society as researchers, and Objects that Share, which uses IoT objects to support resource management.

Pollie Barden with Chloe Varelidi: Firefly Game Firefly is a tagging game played in the dark that explores temporal memory and exploits a balance of collaboration and competition. The “fireflies” are LED badges that flash on and off at various intervals and speeds, based on the natural flashing patterns real fireflies use to communicate with each other. Players can only steal a badge when its light is OFF: they track the fireflies while they are lit, but must find and steal them when they go dark. The core goal is to collect as many badges as possible; an additional layer is to collect firefly badges with the same flashing pattern.
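As a playful illustration of the badge behaviour described above, this toy simulation (Python; the on/off timings and pattern names are invented, not the real badge firmware) flashes each "firefly" on a repeating pattern and only allows a steal while its LED is off.

```python
# Hypothetical flash patterns: (seconds ON, seconds OFF), repeated forever.
PATTERNS = {
    "firefly_a": (0.5, 1.0),
    "firefly_b": (0.2, 0.3),
}

def is_lit(pattern, t):
    """True if a badge with the given (on, off) pattern is lit at time t."""
    on, off = pattern
    return (t % (on + off)) < on

def try_steal(badge, t):
    """A badge can only be stolen while its LED is off."""
    return not is_lit(PATTERNS[badge], t)

for t in (0.1, 0.7, 1.4):
    print(t, {name: ("lit" if is_lit(p, t) else "dark") for name, p in PATTERNS.items()})
print("steal firefly_a at t=0.7:", try_steal("firefly_a", 0.7))
```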

Diego Martinez: SensaBubble SensaBubble uses scented soap bubbles filled with fog to create a playful multisensory display. Images, colours or icons are projected onto the bubbles, and their smell is released as users pop them.

Sriram Subramanian, William Frier and team: Ultrahaptics Ultrahaptics uses focused ultrasound to create tactile sensations in mid-air. This demo will let you feel a variety of tactile sensations on your own hands, out of thin air.

Sriram Subramanian, Deepak Sahoo and team: Acoustic Holographic Levitation This uses phase-engineered sound waves from a planar array of ultrasonic transducers to levitate lightweight objects in mid-air. The demo will show manipulation of the 3D position and rotation of an object, similar to a sci-fi sonic tractor beam.
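Both this demo and the Ultrahaptics demo above rest on the same principle: drive each transducer in the planar array with a phase chosen so that every emitted wave arrives in step at a chosen point in space, producing a strong focus there. The sketch below (Python with NumPy; the array geometry, pitch and 40 kHz frequency are illustrative assumptions, not the actual hardware configuration) computes those per-transducer phases for a single focal point.

```python
import numpy as np

SPEED_OF_SOUND = 343.0      # m/s in air
FREQUENCY = 40_000.0        # 40 kHz, typical for ultrasonic transducer arrays
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def focal_phases(transducer_positions, focal_point):
    """Phase (radians) to apply to each transducer so that waves emitted from
    every element arrive in phase at the focal point."""
    distances = np.linalg.norm(transducer_positions - focal_point, axis=1)
    # Negate the propagation phase k*d so that emission phase plus travel phase
    # is the same (zero, mod 2*pi) at the focus for every element.
    return (-2 * np.pi * distances / WAVELENGTH) % (2 * np.pi)

# Illustrative 8x8 planar array with 10 mm pitch, centred on the origin.
pitch = 0.010
coords = (np.arange(8) - 3.5) * pitch
xs, ys = np.meshgrid(coords, coords)
array_positions = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])

focus = np.array([0.0, 0.0, 0.15])   # a point 15 cm above the array
print(focal_phases(array_positions, focus)[:4])
```

For levitation, the published acoustic-levitation work layers an additional phase signature on top of this focusing solution (for example, adding pi to half of the elements to form a "twin trap" that can hold a lightweight bead); that refinement is omitted here.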

Tom Hamilton, with Kate Howland and Ann Light: Digital traces for the neighbourhood Experience the participatory design process and generate ideas for ways to get people talking and interacting in public spaces such as park benches, bus stops, GP waiting rooms... By taking part in one of the activities from the Material Traces project, you will be introduced to the project’s underlying research questions about sociability and the changing needs of an ageing society.

Claire Potter: Explore the seaside circular economy It is estimated that 8 million pieces of marine litter enter our oceans every day, equivalent to a refuse truck every minute. The vast majority of this is plastic-based: an oil-based material that photodegrades in the sun and the elements, wreaking havoc on our marine life and ecosystems. But if recovered, this ‘waste’ can be seen as a very valuable resource. Come and learn about exactly what marine litter is, and what it can be transformed into…

Diane Simpson-Little: Sensory and Emotional Design When we combine our sensory language and verbal language we can create unique experiences which can trigger emotional and visual responses. These responses can enrich and enliven the way we design, and, as designers, we believe a successful concept should engage all of the senses. This workshop is about understanding how our emotional responses to a sensory experience can be incorporated into the design process.

Final Year Design Students’ Exhibits

Monkeysaver by Chatura Fernando

Threadbear by Megan Walker


PhD Posters

Damien Ablart: Integration of Touch and Taste in Interactive Media

Dmitrijs Dmitrenko: Smell-Based Interaction

William Frier: Designing Resolution Independent Mid-Air Haptic Exploration Procedures 

Patricia Cornelio-Martinez: Investigating the Sense of Control using Haptic Technology

Dario Pittera: Full-Body Experiences Mediated by Mid-Air Haptics

Anthony Trory: Embodied Programming: Supporting the Move from Concrete to Abstract 

Head of Research Group

Professor Ann Light
Professor of Design and Creative Technology

E: ann.light[@]
T: +44 1273 876540
