
ePrints

Segregation of complex acoustic scenes based on temporal coherence

Lookup NU author(s): Sundeep Teki, Dr Sukhbinder Kumar, Professor Tim Griffiths

Downloads

Full text for this publication is not currently held within this repository. Alternative links are provided where available.


Abstract

In contrast to the complex acoustic environments we encounter every day, most studies of auditory segregation have used relatively simple signals. Here, we synthesized a new stimulus to examine the detection of coherent patterns ('figures') from overlapping 'background' signals. In a series of experiments, we demonstrate that human listeners are remarkably sensitive to the emergence of such figures and can tolerate a variety of spectral and temporal perturbations. This robust behavior is consistent with the existence of automatic auditory segregation mechanisms that are highly sensitive to correlations across frequency and time. The observed behavior cannot be explained purely on the basis of adaptation-based models used to explain the segregation of deterministic narrowband signals. We show that the present results are consistent with the predictions of a model of auditory perceptual organization based on temporal coherence. Our data thus support a role for temporal coherence as an organizational principle underlying auditory segregation.


Publication metadata

Author(s): Kumar S; Griffiths TD; Teki S; Chait M; Shamma S

Publication type: Article

Publication status: Published

Journal: eLife

Year: 2013

Volume: 2

Print publication date: 23/07/2013

ISSN (print): 2050-084X

ISSN (electronic):

Publisher: eLife Sciences Publications Ltd.

URL: http://dx.doi.org/10.7554/eLife.00699

DOI: 10.7554/eLife.00699



