
Open Access (ePrints)

Resource allocation models of auditory working memory

Lookup NU author(s): Sundeep Teki, Dr Sukhbinder Kumar, Professor Tim Griffiths (ORCiD)

Downloads

Full text for this publication is not currently held within this repository. Alternative links are provided below where available.


Abstract

Auditory working memory (WM) is the cognitive faculty that allows us to actively hold and manipulate sounds in mind over short periods of time. We develop here a particular perspective on WM for non-verbal auditory objects, as well as for time, based on consideration of possible parallels to visual WM. In vision, there has been a vigorous debate over whether WM capacity is limited to a fixed number of items or whether it represents a limited resource that can be allocated flexibly across items. Resource allocation models predict that the precision with which an item is represented decreases as a function of the total number of items maintained in WM, because a limited resource is shared among stored objects. We consider here auditory work on sequentially presented objects of different pitch, as well as on time intervals, from the perspective of dynamic resource allocation. We consider whether the working memory resource might be determined by perceptual features such as pitch or timbre, or by bound objects comprising multiple features, and we speculate on brain substrates for these behavioural models. This article is part of a Special Issue entitled SI: Auditory working memory. © 2016 Elsevier B.V. All rights reserved.
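The core prediction of resource allocation models described in the abstract — that per-item precision falls as more items are held in WM — can be illustrated with a minimal sketch. This is not the authors' model; it assumes a simple power-law resource split (as in flexible-resource accounts of visual WM), with purely illustrative parameter values (`total_resource`, `alpha`) and a hypothetical pitch-recall task.

```python
import math
import random


def item_precision(n_items, total_resource=10.0, alpha=1.0):
    """Per-item precision (1/variance) under a shared-resource model.

    Assumes a fixed pool of resource divided among items; alpha = 1
    gives strict 1/N sharing. Parameters are illustrative only.
    """
    if n_items < 1:
        raise ValueError("need at least one item")
    return total_resource / (n_items ** alpha)


def simulate_recall(target, n_items, trials=1000, seed=0):
    """Draw noisy recalls of a target pitch (Hz); noise SD grows with load."""
    rng = random.Random(seed)
    sd = math.sqrt(1.0 / item_precision(n_items))
    return [rng.gauss(target, sd) for _ in range(trials)]
```

With `alpha = 1`, doubling the number of maintained items halves the precision of each (e.g. `item_precision(2)` is half of `item_precision(1)`), so simulated recall of a 440 Hz target becomes more variable as load increases — the qualitative pattern resource models predict for sequences of pitches or time intervals.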


Publication metadata

Author(s): Joseph S, Teki S, Kumar S, Husain M, Griffiths TD

Publication type: Review

Publication status: Published

Journal: Brain Research

Year: 2016

Volume: 1640

Issue: Part B

Pages: 183-192

Print publication date: 01/06/2016

Online publication date: 02/02/2016

Acceptance date: 25/01/2016

ISSN (print): 0006-8993

ISSN (electronic): 1872-6240

Publisher: ELSEVIER SCIENCE BV

URL: http://dx.doi.org/10.1016/j.brainres.2016.01.044

DOI: 10.1016/j.brainres.2016.01.044

