Dame Evelyn Glennie, who feels sound. Photo/Reuters

How an Auckland lab is using tech to help deaf people experience music

Inspired by deaf multi-percussionist Dame Evelyn Glennie, the Augmented Human Lab at the Auckland Bioengineering Institute has created an innovative device.

Before losing her hearing, Marama Bowler was a typical teenager who loved listening to music. The 22-year-old has a genetic disorder that caused tumours to grow on her hearing nerves and has been completely deaf since the age of 19. However, music is still a part of Bowler’s life thanks to an innovative device she has been trying out.

MuSS-Bits (music sensory substitution bits) was invented in the Augmented Human Lab at the Auckland Bioengineering Institute by associate professor Suranga Nanayakkara and his team, inspired by the deaf multi-percussionist Dame Evelyn Glennie who feels the vibrations of sound resonating through her body.

The first time Bowler used MuSS-Bits was an emotional experience. “I bawled my eyes out,” she says. “I was shocked by my reaction because I don’t usually cry or get emotional, so it was pretty special to cry with joy.”

Using the device, Bowler now “feels” music in her car, at the gym and at home, enjoying all her old favourites by artists such as Alicia Keys, Bruno Mars and Usher. She even sings along. “It is honestly life-changing,” she says.

Nanayakkara is still exploring the potential of this new technology, which he began developing at the Singapore University of Technology and Design. His first version of it was a chair that converts music into vibrations to be felt over the whole body. That has been in use for the past 11 years in a residential deaf school in Sri Lanka.

MuSS-Bits in action. Photo/AHLAB/Supplied

“We gave them an iPod Shuffle; they load their music and they feel it,” he says. “Feeling might be different to listening, but they still have their own preferences, and if you try to trick them by playing some random noise, they clearly articulate that it’s not what they would like to hear.”

Since moving his lab to the University of Auckland in March last year, Nanayakkara has taken the idea further. MuSS-Bits started out as a technology for deaf people who want to play music and be able to join in a jamming session with other musicians. It has two parts: a sensor that captures sound from a surface or digital device, and a module that lights up and vibrates with the rhythm so the sound is physically felt in real time.

“You can wear it as a smartwatch, put it in your pocket or wear it on your leg,” says the computer scientist and inventor.

Having made contact with Bowler through The Hearing House charity in Auckland, he decided to adapt MuSS-Bits into a portable listening device that could also help solve a few other issues she was having in her day-to-day life.

For instance, while at lunch with her and some of his team, Nanayakkara realised that the speech-recognition phone app Bowler was using to take part in their conversation had its limitations.

“She could see what was being said, but it wasn’t clear who was saying it,” he says.

Suranga Nanayakkara. Photo/Supplied

That inspired the Om Project, the aim of which is to create a smartwatch-type device that not only converts music to vibrations and light, but also has multiple microphones, giving it 360-degree hearing, so it points to whichever direction a sound is coming from. This means the user can tell not only what is being said, but also who is speaking.

“Marama told us she would like to be aware of certain sounds, such as her dog barking or the doorbell ringing,” says Nanayakkara. “So now when these sounds happen, the smartwatch vibrates and shows an icon.”

She was also worried about whether she was speaking too loudly or too softly, so the device provides her with a visual indication of her own volume.

Nanayakkara’s lab has a team of 25 people from 19 countries – including a former Silicon Valley engineer – who are all passionate about inventing useful products. Other assistive-technology projects include the Finger Reader, which allows the visually impaired to read text through a device they wear on their fingers. And Nanayakkara is exploring the scope of artificial intelligence to help with mental-health treatments.

“We want to have a meaningful impact on people’s lives,” he says.

This article was first published in the October 12, 2019 issue of the New Zealand Listener.