Using harmonies and chords to go beyond visuals and enhance the user experience
We rely on sounds like a morning alarm or the distinct beep of an unlocking car to communicate with our technology. Yet there’s an assumption that user interfaces communicate mostly through a screen, overlooking the power of sonic language.
“Although hearing is one of our primary senses, most interfaces today are primarily visual.”
I’ve been a designer for 16 years, and I compose music as a hobby. These two skill sets have reinforced my belief that user interfaces should engage sound as well as vision. In my current job as a UX Designer at Udemy, my team has been working on a revamp of our learning experience. In a brainstorming session, someone surfaced the idea of incorporating sound into the interstitial screens of a course. Excited, I started playing around with synths and MIDI samples to create auditory feedback on lecture progress and completion. We experimented with different instruments, chords, and tempos. The challenge was to use audio to meaningfully illustrate progress while representing our values. What sound represents us? We ended up with some short, subtle motifs played on a marimba and a harp in A major.
This experience left me wondering… what if instead of using beeps and zings as auditory feedback on interfaces, we applied harmonies, notes, or chord progressions as symbolic sounds? What if we chose an instrument or set of instruments that speak to our brand and reflect the voice of our product? What if music was used in such a way that the user intuitively understood its underlying message?
“What if music was used in such a way that the user intuitively understood its underlying message?”
Although hearing is one of our primary senses, most interfaces today are primarily visual. Sound feedback can enhance user interactions, yet we rely almost entirely on how things appear on the screen. Auditory feedback aids the user by enabling them to look away from the device to complete multiple tasks. It also signals that an action has been registered, is in progress, or has been completed, all without the use of a screen. Designing with audio is not easy, though. There are many aspects to consider if you want to keep your experience pleasant, meaningful, and practical.
“Auditory feedback aids the user by enabling them to look away from the device to complete multiple tasks.”
I enjoyed the experience so much that I decided to compile a collection of musical sounds that others could use in their own products. I ended up making over 200 audio samples of harmonies, sequences, SFX, speech, and chord progressions across 8 different instruments.
You can download the full pack here. But if you want to know a bit more about my background, my recommendations for designing musical interfaces, and my process for creating these sounds, keep on reading!
If a tree falls in a forest, do I get a sound notification?
Before talking about music, let’s start with how we interpret and eventually develop meaning from sound. Non-speech audio carries rich information that helps us understand our environment, a process so practiced it has become part of our everyday experience. Just by listening, we can tell when a batter hits the ball, when Velcro is pulled apart, or when the teapot is ready. We’ve long used audio as a feedback mechanism in devices such as TVs, microwaves, cars, toys, and mobile phones. Auditory interfaces can serve as useful and pleasant complements to visual interfaces, or even substitutes, with the rise of wearables.
When designing with audio, it is important to define the specific meaning of each sound early in the process. A sound that communicates critical information should differ markedly from one that merely complements the visuals. Because sound is fundamentally different from vision, it can carry information that vision cannot. In its own way, sound reinforces three core principles of interaction design: visibility (affordances and signifiers), feedback, and consistency.
Auditory designs can convey patterns, changes over time, calls to action, notifications, or warnings. The possibilities are limitless, but that doesn’t mean every interaction needs sound. Audio should enhance the experience, not interfere or distract. To keep repetitive sounds from annoying users, it’s best practice to use short, simple sounds that are informative by their form alone, so the meaning is built into the audio itself.
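To make "short and simple" concrete, here is a minimal sketch (my own illustration, not from any particular product) of rendering a brief confirmation earcon as a WAV file using only Python’s standard library. The 150 ms duration, sine timbre, and A4 pitch are arbitrary choices meant to keep the cue unobtrusive on repetition.

```python
import math
import struct
import wave

def write_earcon(path, freq_hz=440.0, duration_s=0.15, rate=44100):
    """Render a short sine-tone earcon with a linear fade-out to a mono WAV file."""
    n = int(duration_s * rate)
    frames = bytearray()
    for i in range(n):
        fade = 1.0 - i / n  # fade-out avoids an audible click at the end
        sample = 0.5 * fade * math.sin(2 * math.pi * freq_hz * i / rate)
        frames += struct.pack('<h', int(sample * 32767))  # 16-bit PCM
    with wave.open(path, 'wb') as w:
        w.setnchannels(1)
        w.setsampwidth(2)   # 2 bytes = 16-bit samples
        w.setframerate(rate)
        w.writeframes(bytes(frames))
    return n  # number of samples written

# A brief A4 "done" cue: short enough not to annoy when heard repeatedly.
write_earcon('done.wav', freq_hz=440.0, duration_s=0.15)
```

A real product would use designed samples (like the marimba and harp motifs above) rather than raw sine tones, but the constraint is the same: a fraction of a second, with the meaning carried by the sound’s form.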
Design and music are so in tune
While design is my primary passion, music has always had a special place in my heart. My musical background is not the most traditional, yet it’s pretty cliché: I started (horribly) playing guitar in a punk band as a teenager, transitioned to synth-punk with MIDI and DAWs, then worked my way into nu-disco with synths and arpeggiators (James Murphy would shake his head). After “wooing” listeners with música sabrosa in a cumbia band, I decided to explore the “lost art” of DJing (Mexican weddings are my specialty).
Throughout my years as a designer and amateur composer, I have found that these creative processes map onto each other closely. Whether you are composing a song, writing a comic, or designing an experience, your objective is to tell a story. You follow the same basic structure: exposition, rising action, climax, falling action, and resolution. It’s all about taking your audience on a ride.
The similarities don’t stop at structure. The dimensions of sound (pitch, timbre, duration, loudness, direction) are analogous to the elements of design (shape, color, size, texture, direction), and the principles of music and design (composition, form, rhythm, texture, harmony, similarity/contrast) overlap as well.
Why am I telling you this? Because I think the sound and visuals of an interface should be coherent. For example, when designing a warning module, we might use a red color and an alert icon, familiar visual cues that users recognize as danger or risk. Similarly, we could use an alert sound that is high pitched, loud, and unusual in timbre. Visuals and audio on an interface should relate in an analogous or complementary way.
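As a sketch of that analogy (my own example, not a published guideline): pitch can be mapped to semantic urgency the same way color is. The standard MIDI note-to-frequency formula makes the mapping explicit; the specific cue names and note choices below are hypothetical.

```python
import math

def midi_to_hz(note: int) -> float:
    """Convert a MIDI note number to frequency (A4 = MIDI note 69 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)

# Hypothetical mapping: calmer, lower pitches for routine feedback,
# a high pitch for warnings, mirroring the red-plus-alert-icon visual cue.
CUES = {
    'progress': midi_to_hz(64),  # E4, ~330 Hz: low-key, background feedback
    'confirm':  midi_to_hz(69),  # A4, 440 Hz: neutral and familiar
    'warning':  midi_to_hz(93),  # A6, 1760 Hz: high and attention-grabbing
}
```

Just as red reads as "danger" before the user parses the icon, a cue two octaves above the confirmation tone reads as "urgent" before the user consciously identifies the sound.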
BlackBerry compares the visual language of a graphical UI with sounds in their Earconography:
An icon of an envelope can be different colors, have a stamp on it (or not), or be tilted 25 degrees, but as long as it still looks like an envelope, users will know what it represents. Same story for sounds.