Bhavesh Barve

As User Experience (UX) designers, we are usually tasked with "making it usable". We cannot ignore usability; otherwise, it would be like having a beautiful race car without an engine!

There is a constant tension in which UX designers get tied down doing only what is necessary and nothing more, making every experience mundane and similar. But what more can we do?

The main question is, how do we do more while still keeping usability a priority? In this article, I introduce the theory of multisensory design.

What is Multisensory Design?

As we all know, humans are multisensory organisms – it’s an essential part of our existence. Tasting food, for example, involves the senses of smell, taste, touch and sight working together to provide a complete experience. That is why restaurants spend vast amounts of money on architecture and interior design: to create a complete environment and atmosphere and to tailor the experience for the customer.

In some shape or form, digital product design and UX laws try to replicate real-life experiences digitally. Digital designers often focus on three main senses – sight, sound and touch – which, in practice, boils down to two main factors: the visual and the audible.

On a broader level, humans have four types of sensory receptors, each responding to a different physical stimulus:

  • Photoreceptors (light).
  • Chemoreceptors (chemicals).
  • Thermoreceptors (temperature).
  • Mechanoreceptors (mechanical forces).

The information gathered from these receptors and stimuli combines to set off operations such as vision, hearing and smell, known as sense modalities.

There are nine sense modalities — sensations perceived after a stimulus:

  1. Vision: Our ability to see objects.
  2. Hearing: Allows us to perceive sound through our ears.
  3. Smell: The ability to detect odours.
  4. Taste: The ability to discern flavours.
  5. Touch: Physical contact that allows us to perceive materials.
  6. Pain: The sensation of distress, which can be emotional or physical.
  7. Mechanoreception: The body’s perception of vibration, pressure, or other mechanical stimuli.
  8. Temperature: The skin receptors’ ability to perceive hot or cold temperatures.
  9. Interoception: The detection of stimuli and sensations originating within the body.

The above-mentioned modalities have sub-senses whose validity can be debated, as some of them are considered very radical.

Biometric technology – like our all-singing, all-dancing UX biometric testing suite, with eye tracking, GSR and facial expression analysis equipment – can give you insights into how a user’s senses respond to different digital elements.

The Similarities Between Sensory and Digital Design

Whether there are 5, 9 or 33 senses, designers prioritise sight, hearing, and touch because it’s nearly impossible to taste, smell, or feel an interface’s temperature. But what if it wasn’t?

To understand sensory perception, we need to recognise that the brain sits at the centre of every sensory stimulus – according to psychologist Ladan Shams, who studies multisensory perception at the University of California, Los Angeles.

Recent research has brought to light that our sensory systems are far more interdependent than previously thought. For example, the sense of taste can be triggered using sound, colour, words and haptic feedback.

Imagine creating a digital experience for a chocolate bar. How do you convey its crunchy texture? What if we could combine haptics and sound to convey the consistency of the material, enhancing the experience? An interface can and should do more than just provide information.
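To make the idea concrete, here is a minimal sketch of how a crunch texture might be mapped to haptic pulses using the standard Web Vibration API (`navigator.vibrate`). The `crunchToVibrationPattern` helper, its parameters and the intensity-to-duration mapping are illustrative assumptions, not an established technique:

```javascript
// Sketch: translating a "crunch" texture into a haptic vibration pattern.
// navigator.vibrate() takes alternating [on, off] durations in milliseconds;
// here, sharper crunches map to longer pulses, crumbles to shorter ones.

// Hypothetical helper: turn a texture profile (intensities 0..1) into a
// vibration pattern. Names and scaling factors are assumptions.
function crunchToVibrationPattern(intensities, maxPulseMs = 40, gapMs = 30) {
  const pattern = [];
  for (const level of intensities) {
    pattern.push(Math.round(level * maxPulseMs)); // vibration pulse
    pattern.push(gapMs);                          // pause between pulses
  }
  pattern.pop(); // drop the trailing pause
  return pattern;
}

// A bite into a chocolate bar: one big snap followed by smaller crumbles.
const pattern = crunchToVibrationPattern([1, 0.6, 0.3, 0.15]);
console.log(pattern); // [40, 30, 24, 30, 12, 30, 6]

// In a browser on a supporting device, you would play it back with:
// if (navigator.vibrate) navigator.vibrate(pattern);
```

Pairing such a pattern with a short crunch sound (e.g. via the Web Audio API) is one way to test whether the combined cues actually read as "crunchy" to users.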

Best Practice for Multisensory UX Design

Learn About the Senses

To create designs and technology that stimulate the user’s senses, designers need in-depth knowledge of human sensory receptors, stimuli and modalities. It’s advisable to learn about human sensory disorders and theories to make the new design accessible and usable.

Include the Senses in Research

Designing things that appeal to the senses demands research. Depending on the user base and the business, designers have to produce solutions grounded in in-depth analysis, drawing on relevant neurological research.

As designing for the senses is relatively new, there won’t be much data in terms of formal user research. Instead, combine findings from sensory research and quantum biology, keeping currently available technology in mind.

Investigate the Relationships Between Senses

Senses work in tandem: sensory integration is the process whereby the brain prioritises information from the senses and surroundings to inform bodily responses. For example, the brain combines:

  • Sight and hearing to decipher communication cues.
  • Temperature and pain to prevent injury.
  • Smell and taste to trigger digestion.

In digital design, sensory pairings should be tested, as presumed relationships may produce unintended responses or usability obstacles.

Target Specific Senses

UX designers need to include research questions that investigate what triggers users’ senses, then decide which senses the design should trigger to provide the essential experience. Designers also need to be aware of sensory overload: our goal is to enhance the experience, not to overwhelm users by triggering all their senses at once.

Trigger Synesthetic Experiences

Our objective is not to modify existing devices with pheromone-spraying contraptions that trigger a sense, but rather to use existing digital design elements to activate modalities other than sight, hearing and touch – even when prototypical stimuli aren’t present.

If that seems unconvincing, consider that 1 in 23 people experience a persistent blending of perceptual pathways known as synaesthesia: colours are tasted, music is seen, smells are touched. And while synaesthesia is relatively uncommon, synesthetic experiences are not – a truth long leveraged in marketing strategies to uncover cross-sensory connections and metaphors.

Sensory connections are complicated: in one experiment reported by Live Science, researchers influenced people’s sight and proprioception, triggering real pain and touch sensations by poking a prosthetic limb.

Test being carried out to trigger real pain and touch sensations on a prosthetic limb

Sensory Implications of Design Elements

Whilst identifying synesthetic connections, designers should be mindful that certain design choices can have unexpected implications.

There’s no need to scratch your head over every single design element, but you must consider each element’s potential, especially when it comes to high-impact words and graphics.

Although the sense of sight dominates our perception of design, imposing hypothetical constraints like “sight isn’t an option” forces us to explore other means, encourages creativity, and exposes how much digital products ignore the other senses.

These changes might seem far-fetched, but with the introduction of technologies like tastable screens, haptic feedback wearables and Neuralink-style brain implants, UX designers need to start thinking beyond the realm of sight and leverage emerging technologies to keep ahead of the curve.

Our team of UX designers use state-of-the-art technology in our very own UX biometrics research lab, giving you access to in-depth analysis to help you improve customer experience. Interested? Get in touch to find out more.  

Credits

Micha Bowers

Bruce Durie

Norimichi Kitagawa

Daniel Bor, Nicolas Rothen, David J. Schwartzman, Stephanie Clayton and Anil K. Seth
