A Balanced Memory Network
Yasser Roudi*, Peter E. Latham
Gatsby Computational Neuroscience Unit, University College London, London, United Kingdom
A fundamental problem in neuroscience is understanding how working memory—the ability to store information at
intermediate timescales, like tens of seconds—is implemented in realistic neuronal networks. The most likely
candidate mechanism is the attractor network, and a great deal of effort has gone toward investigating it theoretically.
Yet, despite almost a quarter century of intense work, attractor networks are not fully understood. In particular, there
are still two unanswered questions. First, how is it that attractor networks exhibit irregular firing, as is observed
experimentally during working memory tasks? And second, how many memories can be stored under biologically
realistic conditions? Here we answer both questions by studying an attractor neural network in which inhibition and
excitation balance each other. Using mean-field analysis, we derive a three-variable description of attractor networks.
From this description it follows that irregular firing can exist only if the number of neurons involved in a memory is
large. The same mean-field analysis also shows that the number of memories that can be stored in a network scales
with the number of excitatory connections, a result that has been suggested for simple models but never shown for
realistic ones. Both of these predictions are verified using simulations with large networks of spiking neurons.
Citation: Roudi Y, Latham PE (2007) A balanced memory network. PLoS Comput Biol 3(9): e141. doi:10.1371/journal.pcbi.0030141
Introduction

… second, building a Hopfield-type network with low firing rate, was solved …