Speech Development in the Deaf

See Sound Live is a mobile app that helps deaf sign-language users develop speech.

About

Sign languages are the native languages of deaf communities across the world and an important tool for expressive and receptive communication. However, as with any other language, communication is limited to those who know the sign language. See Sound Live started as an idea to provide an additional channel of communication for the deaf and to bridge the communication divide they face when engaging with those who do not know how to sign.

With See Sound Live, a deaf person can learn to speak by harnessing their heightened visual senses. They can start by speaking simple sounds like “aa”, “ee”, “oo”, and gradually progress to longer sounds and words. With regular learning and practice on the See Sound Live app, a deaf person can build a vocabulary of spoken sounds and words like Mama, Papa, Hello, Bye, and more!

How it works

Feedback is the most important part of learning anything: it tells us how to course-correct and what to change to get it right. This applies to speaking as well. A deaf person may be perfectly capable of producing sounds, but when they try to speak they receive no auditory feedback, so they cannot tell whether they are making the right sounds. This missing feedback prevents them from developing clear speech.
On the See Sound Live app, we provide this feedback through visual cues. When a deaf person speaks into the app, the See Sound algorithm creates a visual equivalent of the sound, shown as a pattern of vivid colours. These patterns are markedly different for different sounds and remain consistent when the same sound is repeated. So the brain still receives this valuable feedback on the spoken sound, but through the eyes instead of the ears!
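
As an illustration only (the actual See Sound algorithm is not published here), the sketch below shows one way a sound could be turned into a repeatable colour pattern: the audio is split into short frames, the energy in a few frequency bands is measured, and each band's energy sets the brightness of a fixed hue. The function and parameter names are hypothetical.

```python
import colorsys
import numpy as np

def sound_to_colour_pattern(samples, n_bands=8, frame_len=512):
    """Map an audio signal to a grid of RGB colours (illustrative only).

    Each short time frame becomes one row of colour cells; the energy in
    each frequency band sets the brightness of a fixed hue. The same sound
    therefore yields the same pattern, and different sounds yield visibly
    different ones. This is a hypothetical mapping, not the See Sound
    Live algorithm.
    """
    hues = np.linspace(0.0, 0.8, n_bands)      # one fixed hue per band
    window = np.hanning(frame_len)
    pattern = []
    for start in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[start:start + frame_len]
        spectrum = np.abs(np.fft.rfft(frame * window))
        bands = np.array_split(spectrum, n_bands)
        energies = np.array([band.mean() for band in bands])
        energies = energies / (energies.max() + 1e-9)   # normalise to 0..1
        row = [tuple(int(round(255 * e * c))
                     for c in colorsys.hsv_to_rgb(h, 1.0, 1.0))
               for h, e in zip(hues, energies)]
        pattern.append(row)
    return pattern  # list of frames, each a list of (R, G, B) cells

# Example: a synthetic "aa"-like vowel (fundamental plus two harmonics).
t = np.linspace(0.0, 0.5, 8000, endpoint=False)
vowel = sum(np.sin(2 * np.pi * f * t) for f in (220, 440, 660))
cells = sound_to_colour_pattern(vowel)
print(len(cells), "frames,", len(cells[0]), "colour cells per frame")
```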

The See Sound Live app provides a structured curriculum for learning the basic sounds, along with a reference of the visual equivalent, the colourful pattern that ought to be generated for each sound when spoken correctly. The brain’s visual processing centre uses these colourful patterns as feedback to evaluate the sound produced. So, with practice, and by following the feedback from the reference visual equivalents, a user can modulate their sounds until the brain has learned how to speak them correctly. Over time, with regular practice, starting with simple sounds and moving on to more complex ones, a deaf person can develop speaking skills.
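
Continuing the illustrative sketch above, and again as an assumption rather than the app's actual evaluation method, a practice loop could score how closely the learner's pattern matches the reference pattern for a sound:

```python
import numpy as np

def pattern_similarity(produced, reference):
    """Score how closely a learner's colour pattern matches the reference
    pattern for a sound: 1.0 means identical, 0.0 means maximally different.

    Both arguments are grids of (R, G, B) cells such as those returned by
    sound_to_colour_pattern() above. This scoring rule is a hypothetical
    illustration, not the app's actual evaluation method.
    """
    a = np.asarray(produced, dtype=float)
    b = np.asarray(reference, dtype=float)
    n = min(len(a), len(b))            # trim to a common number of frames
    diff = np.abs(a[:n] - b[:n]).mean() / 255.0
    return 1.0 - diff

# A learner repeats the sound, watches the pattern, and adjusts until the
# visual match (or a score like this one) approaches 1.
produced = [[(200, 0, 0), (0, 180, 20)], [(190, 10, 0), (0, 170, 30)]]
reference = [[(210, 0, 0), (0, 190, 0)], [(200, 0, 0), (0, 180, 10)]]
print(f"similarity: {pattern_similarity(produced, reference):.2f}")
```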

Essentially, the eyes take over from the ears, and the user is able to ‘hear’ with their eyes, or see sound. The See Sound Live algorithm was developed through extensive research, which has demonstrated that a deaf person can learn to speak using their visual senses, in much the same way that a blind person learns to read using their sense of touch with Braille.

To explore the See Sound Live app, please click the Download link at the top of this page.

Contact us

care@seesoundlive.com
© 4S Medical Research P Ltd. 2019