Capture and rendering of spatial sound over headphones
Title | Capture and rendering of spatial sound over headphones
Publication Type | Journal Articles |
Year of Publication | 2006 |
Authors | Duraiswami R, Zotkin DN, O'Donovan A
Journal | The Journal of the Acoustical Society of America |
Volume | 120 |
Issue | 5 |
Pagination | 3094 - 3094 |
Date Published | 2006
Abstract | A theory for capturing an audio scene and then rendering it remotely over headphones is developed. The method relies on capture of a sound field up to a certain order in terms of spherical wave functions. Then, the captured sound field is convolved with the head‐related transfer function and rendered to provide presence in the auditory scene. The sound‐field representation is then transmitted to a remote location for immediate rendering or stored for later use. A system that implements the capture using a spherical array is developed and tested. Head‐related transfer functions are measured using the system described in [D.N. Zotkin et al., J. Acoust. Soc. Am. (to appear)]. The sound renderer, coupled with the head tracker, reconstructs the acoustic field using individualized head‐related transfer functions to preserve the perceptual spatial structure of the audio scene. [Work partially supported by VA.]
URL | http://link.aip.org/link/?JAS/120/3094/4 |
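The abstract outlines a pipeline of spherical-array capture, spherical-harmonic (SH) representation, and HRTF-based binaural rendering. The sketch below is not the authors' implementation; it is a minimal illustration, under assumed placeholder data, of that pipeline: encode array signals into SH coefficients up to a chosen order, decode to plane waves at a set of virtual directions, and convolve each plane wave with a corresponding head-related impulse response (HRIR). The array geometry, quadrature, and HRIRs are hypothetical, and the per-order radial (rigid-sphere) equalization a real spherical-array encoder requires is omitted.

```python
# Minimal sketch of spherical-harmonic capture + HRIR rendering (placeholder data).
import numpy as np
from scipy.special import sph_harm  # sph_harm(m, n, azimuth, colatitude)


def sh_matrix(order, azim, colat):
    """Complex SH basis at the given directions: shape (num_dirs, (order+1)**2)."""
    cols = []
    for n in range(order + 1):
        for m in range(-n, n + 1):
            cols.append(sph_harm(m, n, azim, colat))
    return np.stack(cols, axis=-1)


def encode(mic_signals, mic_azim, mic_colat, order):
    """Least-squares SH encoding of pressure sampled on a spherical array.
    mic_signals: (num_mics, num_samples). Radial equalization is omitted."""
    Y = sh_matrix(order, mic_azim, mic_colat)        # (num_mics, (order+1)**2)
    return np.linalg.pinv(Y) @ mic_signals           # (num_coeffs, num_samples)


def binaural_render(sh_coeffs, order, pw_azim, pw_colat, hrir_left, hrir_right):
    """Decode SH coefficients to plane waves at virtual directions, then
    convolve each plane wave with the matching (hypothetical) HRIR pair."""
    Y = sh_matrix(order, pw_azim, pw_colat)          # (num_dirs, (order+1)**2)
    plane_waves = (Y @ sh_coeffs).real               # (num_dirs, num_samples)
    left = sum(np.convolve(pw, h) for pw, h in zip(plane_waves, hrir_left))
    right = sum(np.convolve(pw, h) for pw, h in zip(plane_waves, hrir_right))
    return left, right


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    order, num_mics, num_samples = 2, 32, 512

    # Placeholder array: random directions and noise signals stand in for a
    # real spherical microphone array recording.
    mic_azim = rng.uniform(0, 2 * np.pi, num_mics)
    mic_colat = np.arccos(rng.uniform(-1, 1, num_mics))
    mic_signals = rng.standard_normal((num_mics, num_samples))

    coeffs = encode(mic_signals, mic_azim, mic_colat, order)

    # Hypothetical 8-direction virtual loudspeaker layout and toy HRIRs; a real
    # system would use measured, individualized HRIRs rotated by the head tracker.
    pw_azim = np.linspace(0, 2 * np.pi, 8, endpoint=False)
    pw_colat = np.full(8, np.pi / 2)
    hrir_left = [rng.standard_normal(64) * 0.1 for _ in range(8)]
    hrir_right = [rng.standard_normal(64) * 0.1 for _ in range(8)]

    left, right = binaural_render(coeffs, order, pw_azim, pw_colat,
                                  hrir_left, hrir_right)
    print(left.shape, right.shape)
```

In such a scheme, head tracking would be applied by rotating the SH coefficients (or the virtual-direction set) before decoding, so the rendered scene stays fixed in world coordinates as the listener turns.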