Rendering localized spatial audio in a virtual auditory space
Title | Rendering localized spatial audio in a virtual auditory space |
Publication Type | Journal Articles |
Year of Publication | 2004 |
Authors | Zotkin DN, Duraiswami R, Davis LS |
Journal | IEEE Transactions on Multimedia |
Volume | 6 |
Issue | 4 |
Pagination | 553-564 |
Date Published | 2004/08 |
ISSN Number | 1520-9210 |
Keywords | 3-D audio; audio signal processing; auditory scene rendering; augmented reality; data sonification; head related transfer functions; perceptual user interfaces; rendering (computer graphics); spatial audio; virtual environments; virtual reality; virtual auditory spaces |
Abstract | High-quality virtual audio scene rendering is required for emerging virtual and augmented reality applications, perceptual user interfaces, and sonification of data. We describe algorithms for creation of virtual auditory spaces by rendering cues that arise from anatomical scattering, environmental scattering, and dynamical effects. We use a novel way of personalizing the head related transfer functions (HRTFs) from a database, based on anatomical measurements. Details of algorithms for HRTF interpolation, room impulse response creation, HRTF selection from a database, and audio scene presentation are presented. Our system runs in real time on an office PC without specialized DSP hardware. |
DOI | 10.1109/TMM.2004.827516 |