Study description
Experience visual-to-auditory sensory substitution in XR!
Have a VR (or rather AR) headset? Then head right away to The vOICe web app in your browser, give the app permission to use your camera, and check the “VR” and “large” checkboxes. You can also use a regular smartphone or PC to get a feel for how it works.
There are no formal studies defined here, but we look forward to reports (e.g. on Twitter or in scientific papers) on how well you or your subjects learn to perform navigation, object (shape) identification, and reaching and grasping tasks based on the sound-encoded live camera view. The soundscapes scan the view from left to right once per second, mapping elevation to pitch and brightness to loudness.
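The mapping described above can be sketched in a few lines of code. This is a minimal illustration, not The vOICe's actual implementation: the function name, frequency range, and sample rate are assumptions chosen for clarity.

```python
import math

def image_to_soundscape(image, duration=1.0, sample_rate=8000,
                        f_low=500.0, f_high=5000.0):
    """Scan a grayscale image (list of rows, top row first, brightness
    in [0, 1]) column by column, left to right, over `duration` seconds.
    Each row gets a fixed sine frequency (top = high pitch, bottom = low
    pitch); pixel brightness sets that sine's loudness. Returns a list
    of audio samples in [-1, 1]."""
    rows = len(image)
    cols = len(image[0])
    # Exponential pitch scale from f_high (top row) down to f_low (bottom row).
    freqs = [f_high * (f_low / f_high) ** (r / max(rows - 1, 1))
             for r in range(rows)]
    samples_per_col = int(duration * sample_rate / cols)
    samples = []
    for c in range(cols):
        for n in range(samples_per_col):
            t = (c * samples_per_col + n) / sample_rate
            # Sum one sine per row, weighted by that pixel's brightness.
            s = sum(image[r][c] * math.sin(2 * math.pi * freqs[r] * t)
                    for r in range(rows))
            samples.append(s / rows)  # keep the mix within [-1, 1]
    return samples
```

With this sketch, a bright pixel in the top-left corner produces a high-pitched tone at the start of the one-second sweep, while a bright pixel in the bottom-right produces a low-pitched tone near the end.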

Apart from its use in near-total blindness, we are interested in whether the auditory representation of vision may also help in rehabilitation for hemianopia and unilateral visual neglect. For that purpose you can turn on the hemianopia simulator of The vOICe web app via special URL query parameters, as in https://www.seeingwithsound.com/webvoice/webvoice.htm?hemianopiaL&vr (or use hemianopiaR for right-hemifield hemianopia). Alternatively, you can toggle simulation of left (right) hemifield hemianopia by clicking or tapping the left (right) quarter of the live video area on your screen. Clicking or tapping near the center of this area instead toggles severe cataracts.
The soundscapes are not affected by the hemianopia or cataract simulation, so normally sighted users will hear more visual detail than they can see with their eyes in areas blinded by simulated hemianopia or cataracts.
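The separation between the masked display and the unmasked audio can be illustrated with a small sketch. This is a hypothetical model of the behavior described above, not The vOICe's actual code; the function name and frame representation are assumptions.

```python
def mask_hemifield(frame, side):
    """Black out one half of a grayscale frame (a list of rows of
    brightness values in [0, 1]). side is 'L' for left-hemifield loss
    or 'R' for right-hemifield loss; anything else leaves the frame
    unchanged."""
    cols = len(frame[0])
    half = cols // 2
    masked = []
    for row in frame:
        if side == 'L':
            masked.append([0.0] * half + row[half:])
        elif side == 'R':
            masked.append(row[:half] + [0.0] * (cols - half))
        else:
            masked.append(list(row))
    return masked

frame = [[0.2, 0.9], [0.7, 0.4]]
display_frame = mask_hemifield(frame, 'L')  # what the user sees on screen
audio_frame = frame                         # the soundscape still encodes the full view
```

Because the soundscape is computed from `audio_frame` rather than `display_frame`, the simulated blind hemifield remains audible even though it is invisible, which is exactly the situation the simulation is meant to create for normally sighted users.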

Note: The vOICe web app runs on most devices with a camera and a modern browser. However, it does not work on Oculus Quest and Oculus Rift, because these devices do not allow access to the camera frames.
Participant requirements
No requirements given.
Duration
Not stated
Compensation
Participants are uncompensated
Join study