2-D Auditory Mapping of Virtual Environment Using Real-Time Head Related Transfer Functions

"2-D Auditory Mapping of Virtual Environment Using Real-Time Head Related Transfer Functions", (2002)
Carlos Ordonez, Armando Barreto and Navarun Gupta

ABSTRACT: The human brain is capable of localizing sounds from events occurring in its surroundings; it can pinpoint the directions from which those sounds arrive. This project takes advantage of that ability to create a simple auditory map of a virtual environment using Head Related Transfer Functions (HRTFs). HRTFs are the basis of the most prominent techniques for digital sound spatialization. In this project, a synthetic virtual environment is defined by a maze of vertical and horizontal walls drawn on the computer screen using Matlab's Graphical User Interface (GUI) components. In this virtual environment the subject is represented by the computer cursor. The distances from the cursor to the nearest virtual walls are calculated continuously and used to control the intensity of four virtual sounds, spatialized as if originating at the front, back, left and right of the subject. These four spatialized sounds are delivered through headphones to a blindfolded user as he/she navigates the virtual environment, moving the cursor with the four arrow keys on the computer keypad. Virtual navigation tests with 14 subjects confirm that the remote navigation cues provided by the spatialized sounds are more helpful in avoiding collisions with the virtual walls than a simple warning tone provided to the blindfolded user only in the immediate vicinity of the walls.
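The distance-to-intensity scheme described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' Matlab implementation: the wall representation (axis-aligned segments), the function names, and the linear distance-to-gain mapping with a cutoff `d_max` are all assumptions made for illustration.

```python
# Hedged sketch of the abstract's scheme: for a cursor position, find the
# distance to the nearest wall in each cardinal direction, then map each
# distance to an intensity for the corresponding spatialized sound source.
# Wall format and the linear gain law are illustrative assumptions.

def wall_distances(x, y, v_walls, h_walls):
    """Distance from cursor (x, y) to the nearest wall in each direction.
    v_walls: list of (wx, y0, y1) vertical segments.
    h_walls: list of (wy, x0, x1) horizontal segments.
    Returns {direction: distance}, inf if no wall lies that way."""
    d = {'left': float('inf'), 'right': float('inf'),
         'front': float('inf'), 'back': float('inf')}
    for wx, y0, y1 in v_walls:
        if y0 <= y <= y1:                 # wall spans the cursor's row
            if wx < x:
                d['left'] = min(d['left'], x - wx)
            elif wx > x:
                d['right'] = min(d['right'], wx - x)
    for wy, x0, x1 in h_walls:
        if x0 <= x <= x1:                 # wall spans the cursor's column
            if wy > y:
                d['front'] = min(d['front'], wy - y)
            elif wy < y:
                d['back'] = min(d['back'], y - wy)
    return d

def intensities(dists, d_max=10.0):
    """Map each distance to a gain in [0, 1]: louder when closer,
    silent at or beyond d_max (an assumed audibility cutoff)."""
    return {k: max(0.0, 1.0 - min(v, d_max) / d_max)
            for k, v in dists.items()}
```

In the actual system, each of the four gains would scale a sound stream already filtered through the HRTF pair for its direction, so the user hears louder sound from whichever side a wall is approaching.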