Sensory substitution and augmentation

Sensory substitution refers to the use of one sensory modality to supply environmental information normally gathered by another sense while still preserving some of the key functions of the original sense.

Is a: Technology

Technology attributes

Created/Discovered by: Paul Bach-y-Rita
Related industries: Medical device, Neurology, Healthcare
Related organizations: Wicab, Inc.; NeoSensory

Other attributes

Wikidata ID: Q356133

Sensory substitution refers to the use of devices that employ one sensory modality to supply environmental information normally gathered by another sense, usually while still preserving some of the key functions of the original sense; an example is the use of auditory signals to give information about visual scenes. Sensory substitution requires active translation of stimulation between sensory systems, which makes it similar to, but distinct from, synesthesia, an involuntary association of one sense or sensory attribute with another.

Sensory substitution systems generally consist of three parts: a sensor, a coupling system, and a stimulator. In this type of system, a sensor records stimuli and sends the signals to a coupling system which interprets these signals and transmits them to a stimulator.
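This three-stage pipeline can be summarized in a minimal sketch (the class and method names below are illustrative, not taken from any particular device):

```python
# Minimal sketch of the sensor -> coupling system -> stimulator pipeline
# described above. All names are illustrative, not from a real device.

class Sensor:
    """Records stimuli from the environment (e.g., a camera frame)."""
    def read(self):
        raise NotImplementedError

class CouplingSystem:
    """Interprets sensor signals and translates them into a form
    the stimulator can deliver."""
    def translate(self, signal):
        raise NotImplementedError

class Stimulator:
    """Delivers the translated pattern to an intact sensory modality
    (e.g., electrodes on the tongue, vibration motors on the skin)."""
    def emit(self, pattern):
        raise NotImplementedError

def run_cycle(sensor, coupler, stimulator):
    # One sense-translate-stimulate cycle; a real device loops this
    # continuously at the sensor's frame rate.
    stimulator.emit(coupler.translate(sensor.read()))
```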

The use of sensory substitution systems can help people recover the ability to perceive a defective sensory modality by substituting information delivered through a different, intact modality.

Physiology

Sensory substitution systems work on the principle of the plasticity of the human brain and human perception. A person who is blind or deaf does not necessarily lose the brain's ability to see or hear, but rather loses the ability to transmit sensory signals from the periphery to the brain. In a sensory substitution system, an intact sensory modality relays the missing information to the brain. This includes touch-to-vision substitution, which transfers information from touch receptors to the visual cortex for interpretation and perception.

Through fMRI studies, researchers have been able to observe which parts of the brain are activated during sensory perception. In this way, blind persons receiving tactile information have been observed to activate their visual cortex as they perceive the objects.

Brain plasticity

Plasticity, the quality that allows the adult brain to adapt, is the principle behind users' ability to learn sensory substitution systems: the brain reconfigures itself to compensate for sensory damage. The principle was demonstrated in a 2002 study which observed damaged spatial-processing centers in mice reconfiguring themselves to compensate for the damage.

In the scenario above, where a blind person receives tactile information, such as when reading Braille, fMRI studies have shown the occipital lobe being used to functionally perceive objects. This form of brain plasticity, often called cortical re-mapping or reorganization, takes place when the brain experiences some sort of deterioration. In the example, the blind person shows cross-modal recruitment of the occipital cortex during perceptual tasks such as Braille reading. Sensory substitution systems take advantage of this functionality of the brain.

Tactile sensory substitution
Tactile visual substitution

In tactile sensory substitution systems, touch is used to compensate for the loss of other sensory modalities, with vision being the most commonly substituted. In tactile-vision substitution systems, images or video are converted into patterns of tactile stimulation. The original version of this was the TVSS by Paul Bach-y-Rita, which converted video or images into tactile feedback on the person's back.

BrainPort Vision

This system later developed into the BrainPort device, developed by Wicab. The BrainPort uses a small video camera on the user's brow connected to a small postage-stamp-sized device the wearer holds on their tongue. The device carries four hundred tiny electrodes which deliver shocks based on the image, which is reduced in resolution to four hundred gray-scale pixels to match the electrodes on the lollipop. Darker pixels send the user a stronger shock, while lighter pixels merely tingle. The resulting vision allows users to understand what is in front of them.
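A rough sketch of the image-to-electrode mapping described above might look as follows (assuming NumPy and a 20 x 20 electrode grid; the intensity scaling is an illustrative choice, not Wicab's actual calibration):

```python
import numpy as np

def image_to_electrodes(gray: np.ndarray, grid: int = 20) -> np.ndarray:
    """Map a gray-scale image (values 0-255) onto a grid x grid array
    of stimulation intensities in [0, 1]. With grid=20 this gives the
    400 electrodes described above. Darker pixels produce stronger
    stimulation, lighter pixels weaker."""
    h, w = gray.shape
    # Crop so the image divides evenly, then average-pool each cell.
    gray = gray[: h - h % grid, : w - w % grid]
    cells = gray.reshape(grid, gray.shape[0] // grid,
                         grid, gray.shape[1] // grid).mean(axis=(1, 3))
    # Invert: 0 (black) -> full intensity, 255 (white) -> near zero.
    return 1.0 - cells / 255.0
```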

Image of a BrainPort device. (credit: frontiers.org)

In the BrainPort system, the tongue is an ideal receptor because of its sensitivity, which allows a lower voltage to be used. Using an area of skin for tactile feedback instead runs into the problem of the high impedance of dry skin, which requires a larger power draw.

At Arizona State University's Center for Cognitive Ubiquitous Computing, researchers have developed haptic belts and haptic gloves to help people who are blind perceive situational information. The main research is into designing haptic representations of emotions that are distinct and intuitive, in order to communicate information such as facial expressions.

Image of Arizona State University's VibroGlove. (credit: cubic.asu.edu)

Tactile auditory substitution

David Eagleman developed a new device for sound-to-touch hearing, which he presented at TED in 2015. The research laboratory was expanded into the company NeoSensory, which began developing devices that capture sound and turn it into patterns of touch on the skin. The technology relies on the brain learning to transform the tactile stimuli into signals it can interpret. The vest Eagleman designed could help deaf persons parse auditory data by, in effect, transferring the work of the inner ear to the skin. Eagleman and NeoSensory are also working to develop new ways for humans to experience data.

David Eagleman's TED 2015 talk on the haptic feedback vest.

Tactile vestibular substitution

Sensory substitution systems have been used to help people with balance disorders or with vestibular damage caused by adverse reactions to antibiotics. Wicab's BrainPort Balance Plus device is an example of a commercially available system. It provides sensory information about the position of a person's body through electrical impulses delivered through a small tab held against the tongue. These impulses help users understand which way they are leaning, and have been used to improve, and retain improvements in, patients' balance.

Tactile to tactile substitution

Used to help restore peripheral touch sensation, tactile-to-tactile sensory substitution systems were demonstrated by Paul Bach-y-Rita, who restored touch in a patient who had lost peripheral touch sensation from leprosy. The patient was equipped with a glove containing artificial contact sensors coupled to skin sensory receptors on the forehead. After training, the patient was able to experience data from the glove as if it originated from the hand.

Tactile feedback for prosthetic limbs

Similar to tactile sensory substitution systems for the loss of peripheral touch, technologies have been developed to provide patients with prosthetic arms with tactile and kinesthetic sensibilities. These systems use similar principles to other sensory substitution systems, restoring the perception of touch to amputees through direct stimulation or microstimulation of the tactile nerve afferents.

Tactile sensory substitution systems

Device name | Description | Related organization
BrainPort Balance Plus | Oral electro-tactile stimulation for improving balance, posture, and gait | Wicab, Inc.
BrainPort Vision Pro | Oral electronic vision aid providing electro-tactile stimulation for mobility, orientation, and object recognition | Wicab, Inc.
Haptic Belt | A vibrotactile belt providing situational information for the visually impaired | Arizona State University Center for Cognitive Ubiquitous Computing
naviBelt | A vibrotactile belt helping sighted and visually impaired persons navigate | feelSpace
Neosensory Buzz | A wrist-worn tactile feedback device translating sound into vibrational patterns | NeoSensory

Auditory sensory substitution

Auditory sensory substitution systems aim to use auditory signals to compensate for the lack of other sensory modalities. These devices translate data from visual or tactile sensors into auditory signals relayed via the auditory system to the brain.

Auditory visual substitution

In auditory visual substitution, systems work to provide synthetic vision using sound. Research into neural plasticity has shown that the visual cortex of adult blind people can become responsive to sound, enabling them to see with sound. These systems encode live video from a head-mounted camera into sound. Auditory visual substitution systems work to solve the problem of space and spatial tasks found in tactile sensory substitution for vision, which they do through the use of stereo and spatial sound.

The vOICe Device

A version of auditory visual substitution, the vOICe device translates vertical position to frequency, left-right position to scan time, and brightness to loudness. In studies, both blindfolded sighted participants and blind participants have been able to recognize and localize artificial objects after training with the system.
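The mapping can be sketched as follows (a simplified mono rendering assuming NumPy; the frequency range and scan duration are illustrative parameters, not the vOICe's exact values):

```python
import numpy as np

def voice_encode(gray, scan_time=1.0, f_lo=500.0, f_hi=5000.0, sr=44100):
    """Simplified vOICe-style encoding. The image is scanned column by
    column over scan_time seconds (left-right position -> time in the
    scan), each row is assigned a fixed sine frequency (vertical
    position -> pitch, top of image = highest), and pixel brightness
    sets each sine's amplitude (brightness -> loudness).
    gray: 2D array of values in [0, 255]. Returns a mono waveform."""
    rows, cols = gray.shape
    n = int(sr * scan_time / cols)                   # samples per column
    t = np.arange(n) / sr
    # One frequency per row, log-spaced, highest frequency at the top.
    freqs = np.logspace(np.log10(f_hi), np.log10(f_lo), rows)
    tones = np.sin(2 * np.pi * freqs[:, None] * t)   # (rows, n)
    chunks = [(gray[:, c, None] / 255.0 * tones).mean(axis=0)
              for c in range(cols)]
    return np.concatenate(chunks)
```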

Visualization of the vOICe encoding system.

EyeMusic

Built on the same idea as the vOICe device, Amir Amedi's lab at the Hebrew University of Jerusalem has been developing the EyeMusic device, which uses bone-conduction headphones to create a musical image. Rather than the pixelated sounds used by the vOICe, the EyeMusic uses jazz music. The device lets users hear musical sounds as they scan pictures from left to right. Pitch corresponds to height: high notes represent high pixels, middle notes middle pixels, and low notes low pixels. Amplitude corresponds to brightness, such that louder equals brighter, and a different musical instrument represents each color. By producing sounds based on color, the EyeMusic offers a video-to-audio sensory substitution system that provides more than gray-scale imaging.
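A toy version of the pixel-to-note mapping might look like this (the pentatonic scale and the color-to-instrument table below are invented for illustration; EyeMusic's actual scale and instrument assignments differ):

```python
# Toy EyeMusic-style mapping: pitch from height, amplitude from
# brightness, instrument from color.

PENTATONIC_HZ = [261.6, 293.7, 329.6, 392.0, 440.0]  # one-octave pentatonic

COLOR_TO_INSTRUMENT = {   # hypothetical assignments, not EyeMusic's table
    "white": "piano",
    "blue": "marimba",
    "red": "trumpet",
    "green": "reed organ",
}

def pixel_to_note(row: int, n_rows: int, brightness: int, color: str):
    """Return (frequency_hz, amplitude, instrument) for one pixel.
    Rows nearer the top of the image map to higher notes; brighter
    pixels play louder; the pixel's color selects the instrument."""
    step = (n_rows - 1 - row) * len(PENTATONIC_HZ) // n_rows
    freq = PENTATONIC_HZ[step]
    amp = brightness / 255.0
    return freq, amp, COLOR_TO_INSTRUMENT.get(color, "piano")
```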

Example image of the EyeMusic device developed by Amir Amedi's lab at the Hebrew University of Jerusalem.

The KromoPhone

Similar to the EyeMusic and the vOICe visual-to-auditory sensory substitution systems, the KromoPhone offers three modes of color sonification: RGB, HSL, and RGBYW. The system associates the amplitude of pre-selected pure tones with the level of each color channel. Another difference is that the scan the KromoPhone uses to encode spatial information is not the passive left-to-right horizontal scan of the vOICe and EyeMusic but an active scan controlled by the user, so spatial information is implicitly transmitted through dynamic proprioceptive information.
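A minimal sketch of the RGB mode, assuming NumPy (the tone frequencies below are assumptions, not the KromoPhone's published values):

```python
import numpy as np

# Hypothetical pure-tone assignments for the RGB mode; the actual
# KromoPhone frequencies may differ.
RGB_TONES_HZ = {"r": 440.0, "g": 880.0, "b": 1760.0}

def sonify_rgb(pixel, duration=0.1, sr=44100):
    """Sonify one pixel under the user's active scan: each color
    channel's level (0-255) sets the amplitude of its pre-assigned
    pure tone, and the three tones are mixed into a mono signal."""
    t = np.arange(int(sr * duration)) / sr
    mix = sum((level / 255.0) * np.sin(2 * np.pi * RGB_TONES_HZ[ch] * t)
              for ch, level in zip("rgb", pixel))
    return mix / 3.0
```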

SeeColOr

SeeColOr applies image filters to a stereoscopic colored video in a hue-saturation-luminance color format. Pre-defined hue ranges are used to associate musical instruments with the color of the pixels; pitch encodes the saturation, and a second musical sound is added depending on the luminance. The SeeColOr system is intended to add color to visual-to-audio sensory substitution systems.

PSVA

PSVA, or Prosthesis Substituting Vision for Audition, is another visual-to-auditory sensory substitution device. The system uses a head-mounted camera to translate real-time visual patterns into sound. Visual stimuli are transduced into auditory stimuli using a pixel-to-frequency relationship that couples a rough model of the human retina with an inverse model of the cochlea. In the PSVA system, sounds are not generated through left-to-right scans but can be generated while the user stays still.

LibreAudioView

Developed by Maxime Ambard at LEAD at the Université de Bourgogne, the LibreAudioView system is a visuo-auditory sensory substitution device which translates visual information into audio signals. The aim of the system is to help visually impaired people in daily tasks requiring spatial localization. In the LibreAudioView system, horizontal position is encoded using stereo panning, where a sound with larger amplitude in the right channel indicates an object to the right, and vertical position is transmitted using pitch variation, where a more elevated position is represented by a higher tone.
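The two encodings can be sketched together as follows (assuming NumPy; the frequency range and the simple linear panning law are illustrative choices, not the device's documented parameters):

```python
import numpy as np

def encode_position(x, y, duration=0.05, sr=44100, f_lo=300.0, f_hi=3000.0):
    """LibreAudioView-style encoding of a single object position.
    x, y are normalized to [0, 1]: x = 0 is far left, y = 1 is the top.
    Horizontal position -> stereo panning (a rightward object is louder
    in the right channel); vertical position -> pitch (a higher object
    gives a higher tone). Returns a (2, samples) stereo buffer."""
    t = np.arange(int(sr * duration)) / sr
    tone = np.sin(2 * np.pi * (f_lo + y * (f_hi - f_lo)) * t)
    return np.stack([(1.0 - x) * tone, x * tone])   # (left, right)
```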

TheVibe

An open-source project, TheVibe associates a specific tone with each region of an acquired image. The conception of TheVibe's sonification scheme, and the associated coupling of sensors and receptors, allows for more complex sonification. Similar to the PSVA system, TheVibe is capable of generating sound online from whole gray-scale images without horizontal scans.

SeeHear Chip

Developed by a team from Caltech, the SeeHear system is a type of video-to-auditory system intended to map visual signals from moving objects into auditory signals that are projected through earphones to a listener. The sensation is intended to mimic echolocation, in that the listener should experience moving objects as if they were emitting sound. The auditory signals, with sound cues from the environment, are intended to help blind persons develop a more complete internal model of their surroundings.

Auditory substitution systems

Device name | Description | Related organization
EyeMusic | A visual-to-auditory system using jazz music to create soundscapes for the visually impaired | Hebrew University of Jerusalem
LibreAudioView | A sonification method to assist pedestrian locomotion for the visually impaired | Université de Bourgogne
Prosthesis for Substitution of Vision by Audition | A visual-to-auditory system for assisting the visually impaired | SeeingWithSound
The vOICe | A visual-to-auditory system for helping the visually impaired navigate | SeeingWithSound
theVibe | Software to convert real-time video into an audio stream | S. Hanneton

Other Systems
Drive Grip

Developed by a research team led by Dr. Dennis Hong out of the Robotics and Mechanisms Laboratory at Virginia Tech, the Drive Grip system uses touch as a sensory substitute for vision to allow blind persons to drive. The car uses lasers to collect information about obstacles and boundaries on the road. This information is relayed to the driver through specialized gloves which vibrate on the fingertips of each hand to indicate the direction the car should be turned. The driver's chair also vibrates to offer guidance on the optimal speed, with the intensity and placement of the vibration informing the driver whether they should slow down, speed up, or come to an emergency stop. The system also uses puffs of air on the driver's fingertips, allowing the driver to make advance decisions and offering greater independence from the automated feedback of the car.

A graduate student using the Drive Grip system.

Project Cyborg

Kevin Warwick and his team at the Department of Cybernetics, University of Reading, have been working on merging silicon chip transponders with the human nervous system, both to monitor an individual and their emotions and to explore whether such a chip could help the brain process information through the nerve branches. The research team has also done work on having the silicon chip carry personal information about a person, including banking and credit card information, insurance information, blood type, and medical records, with the data being updatable as and when needed.

Sensory Augmentation

Building on the research and experiments conducted on sensory substitution systems, the possibility of augmenting the body's sensory apparatus is also being developed. The intention of these projects is to extend the body's ability to sense aspects of the environment not normally perceivable in its natural state.

HearSpace project

The HearSpace project, from the University of Paris, builds on sensory substitution methods of using stimuli to supplement the human senses. In the project, the team uses distal sounds to develop a user's perception of self-rotation and to help users perceive their vestibular self-rotation even when the HearSpace device is absent.

The project was able to help users orient themselves to magnetic North by integrating sensors (such as a compass, gyroscopes, and accelerometers) into headphones and rendering their readings as spatial sound. The research team used the sound of a waterfall to represent magnetic North.
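A rough sketch of the rendering step (a stereo amplitude-panning approximation; a real implementation like HearSpace's would use head-related transfer functions for full spatialization):

```python
import math

def north_gains(heading_deg):
    """Return (azimuth_deg, left_gain, right_gain) for a sound anchored
    at magnetic North, given the listener's compass heading in degrees
    (0 = facing North). Uses constant-power amplitude panning as a
    simple stereo approximation of spatial sound."""
    azimuth = (-heading_deg) % 360.0       # where North lies relative to the head
    pan = math.sin(math.radians(azimuth))  # -1 = hard left, +1 = hard right
    left = math.cos((pan + 1.0) * math.pi / 4.0)
    right = math.sin((pan + 1.0) * math.pi / 4.0)
    return azimuth, left, right
```

For example, facing North (heading 0) centers the waterfall sound, while facing East (heading 90) renders it fully in the left channel, where North actually lies.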

E-sense project

The e-sense project of the Open University and Edinburgh University is a speculative research project into wearables and the prototyping of sensory augmentation devices. The devices include a minimal tactile-vision sensory substitution system similar to the TVSS developed by Paul Bach-y-Rita. The project has developed a MusicJacket, a system for training violin players using motion capture and vibrotactile feedback; a Haptic Drum Kit to help teach drummers polyphonic rhythms; and Locusts, a multitouch table with vibrotactile feedback for workspace awareness.

feelSpace

At the University of Osnabrueck, the feelSpace team developed a vibrating compass belt which works as an artificial sensory organ, helping the wearer know where North is at all times. This project was used to develop the naviBelt, which can help both sighted users and blind persons navigate and understand the feeling of space.

Sensory augmentation systems

Device name | Description | Related organization
Haptic Drum Kit | A system for teaching drummers polyphonic rhythms | e-sense
Locusts | A multitouch table offering vibrotactile feedback for workspace awareness | e-sense
MusicJacket | A vibrotactile feedback system for training violin players | e-sense
Project Cyborg 1.0 | Project into silicon chip implants for local electronics control | Kevin Warwick
Project Cyborg 2.0 | Project into merging silicon chips with the nervous system for the control of electronic devices | Kevin Warwick

History

In 1969, Paul Bach-y-Rita, who had begun his medical career in visual rehabilitation and gained a reputation as a specialist in the neurophysiology of the eye muscles, completed his first prototype of a sensory substitution device. The impetus for the development of this device came from his father's ordeal.

After suffering a debilitating stroke in 1958 which left him wheelchair-bound and barely able to speak or move, Pedro Bach-y-Rita recovered with help from his son George Bach-y-Rita, and two years later had returned to his job as a teacher. Pedro Bach-y-Rita died of a heart attack while hiking a few years later. When a neuropathologist performed an autopsy, they discovered that the parts of his brain responsible for motion and involuntary muscle movements were still severely damaged, despite his recovery.

The device Paul Bach-y-Rita went on to develop consisted of a large dentist's chair, an old TV camera, and a grid of vibrating, Teflon-tipped pins mounted on the chair. This sensory substitution device, the TVSS, converted analog video into vibration and allowed users to feel pictures on their back. In initial studies, with a few hours' practice, users who had been blind from birth were able to distinguish straight lines from curved ones and identify objects such as telephones and coffee mugs.


