2D Auditory Mapping of a Virtual Environment Using Real-Time Customized Head-Related Transfer Functions



by Carlos Ordonez



The human brain is capable of localizing sounds from events occurring in its surroundings: it can identify and pinpoint the directions from which sounds arrive. This project will take advantage of that ability to create a simple auditory map of a virtual environment using Head-Related Transfer Functions (HRTFs), which are the basis of the most prominent techniques for digital sound spatialization. In this project, a synthetic virtual environment will be represented by a maze of vertical and horizontal walls drawn on the computer screen using Matlab's Graphical User Interface (GUI) components. Within this virtual environment, the subject will be represented by the computer cursor. The objective is to have a blindfolded experimental subject traverse as much of the virtual maze as possible, moving the screen cursor by means of the arrow keys, without colliding with its walls.
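As an illustration only (the project draws the maze with Matlab GUI components; the grid representation, function names, and ASCII rendering below are assumptions, sketched in Python), the maze of vertical and horizontal walls can be modeled as a boolean occupancy grid:

```python
import numpy as np

def make_maze():
    """A small example maze: outer border plus two inner wall segments.
    True cells are walls; False cells are free space."""
    maze = np.zeros((8, 12), dtype=bool)
    maze[0, :] = maze[-1, :] = maze[:, 0] = maze[:, -1] = True  # outer border
    maze[3, 2:7] = True     # horizontal inner wall
    maze[3:7, 8] = True     # vertical inner wall
    return maze

def render(maze, cursor):
    """Text view of the maze: '#' for walls, 'o' for the subject's
    cursor, '.' for free cells (a stand-in for the on-screen drawing)."""
    rows = []
    for r in range(maze.shape[0]):
        row = "".join("o" if (r, c) == cursor else "#" if maze[r, c] else "."
                      for c in range(maze.shape[1]))
        rows.append(row)
    return "\n".join(rows)
```

A grid of wall cells keeps the later range measurements trivial: scanning outward from the cursor cell by cell finds the nearest wall in each direction.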

Instead of the visual display of the maze, the subject will be guided by four spatialized sounds continuously simulated by the system, appearing to originate at the front, the back, the left and the right of the subject. These four spatialized sounds will be delivered to the experimental subject through headphones, and their intensity will indicate the closeness of the nearest object in the virtual environment with respect to the virtual representation of the subject, i.e., the current position of the screen cursor, in the four corresponding directions. To achieve this effect, the system will continuously determine the virtual distance of the screen cursor to the nearest wall of the maze in each of the up, down, left and right directions, and it will modify the intensity of the spatialized sound coming from the front, back, left and right of the listener, respectively. The intensity of the spatialized sound in each direction will be proportional to the inverse of the distance to the nearest object in the virtual environment. This process of real-time dynamic spatialization of the four sounds will utilize the individual HRTFs of each experimental subject, measured prior to the virtual navigation test.
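The per-direction range measurement and the inverse-distance intensity mapping described above can be sketched as follows (illustrative Python only; the project is implemented in Matlab, and the grid representation and function names are assumptions):

```python
import numpy as np

# Cursor steps for each of the four measured directions.
DIRECTIONS = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}

def distances_to_walls(maze, pos):
    """Scan outward from the cursor cell in each direction until a wall
    cell (True in the boolean grid) is reached; return cell distances."""
    d = {}
    for name, (dr, dc) in DIRECTIONS.items():
        r, c = pos
        steps = 0
        while not maze[r + dr, c + dc]:
            r, c = r + dr, c + dc
            steps += 1
        d[name] = steps + 1          # distance in cells to the nearest wall
    return d

def intensities(distances):
    """Intensity of each spatialized sound is proportional to the inverse
    of the distance to the nearest wall in that direction."""
    return {k: 1.0 / v for k, v in distances.items()}
```

With this mapping, a sound at full intensity in a given direction means the cursor is immediately adjacent to a wall on that side, and the intensity fades as the wall recedes.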

The aim of this project is to evaluate the navigation efficiency of a “blind” individual in an environment that is dynamically mapped by continuous range measurements in predetermined directions, conveyed to the subject through the superposition of multiple spatialized sounds.



The individual’s HRTFs are obtained using AUSIM’s 3D-sound measurement equipment. The data collected through this system for 0, 90, 180, and 270 degrees of azimuth, at 0 degrees of elevation, are uploaded into Matlab. A GUI is created to represent a virtual maze that the subject must navigate, updating his/her virtual location through steps executed with the computer’s arrow keys. The system will evaluate each subject’s navigation efficiency by keeping track of the total distance traversed in the virtual maze and the number of times the cursor “stumbled” into a virtual wall.
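The spatialization step for each of the four measured azimuths amounts to convolving the source sound with the head-related impulse response (HRIR, the time-domain form of the HRTF) of each ear and applying the distance-dependent gain. A hedged Python sketch (the project uses Matlab; the function name, array shapes, and `gain` parameter are illustrative assumptions, with placeholder arrays standing in for the measured responses):

```python
import numpy as np

def spatialize(mono, hrir_left, hrir_right, gain=1.0):
    """Render a mono signal at a measured direction by convolving it with
    the per-ear HRIRs, then scaling by the inverse-distance gain.
    Returns an (N, 2) stereo array for headphone playback."""
    left = gain * np.convolve(mono, hrir_left)
    right = gain * np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=1)
```

In the real-time system, the four directional streams rendered this way would be summed sample-by-sample before playback, so the listener hears all four range cues superimposed.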


Testing Implementation:

To test this project, the objects in the maze will warn the user whenever he or she passes the pointer over an object, i.e., “hits” a wall. The number of warnings will be counted to determine the efficiency of navigation during each test.
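A trial of this test procedure could be scored as below (an illustrative, self-contained Python sketch; the project uses Matlab, and the scoring function and move encoding are assumptions beyond what the source specifies):

```python
import numpy as np

# Cursor steps for each arrow key.
MOVES = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}

def run_trial(maze, start, keys):
    """Replay a sequence of arrow-key presses on a boolean wall grid;
    return (cells traversed, number of wall-hit warnings).
    A press into a wall leaves the cursor in place and counts a warning."""
    pos, traversed, hits = start, 0, 0
    for key in keys:
        dr, dc = MOVES[key]
        r, c = pos[0] + dr, pos[1] + dc
        if maze[r, c]:
            hits += 1            # the cursor "stumbled" into a virtual wall
        else:
            pos = (r, c)
            traversed += 1
    return traversed, hits
```

Tracking both quantities matches the evaluation described above: total distance traversed together with the warning count gives a per-trial measure of navigation efficiency.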



The system will be implemented on a Personal Computer, and its development will be based on the Matlab environment.