Dr Adam Barrett
ANC, University of Edinburgh, UK
Shannon Information Capacity of Discrete Synapses
Fri May 15, 2009. 11.00am.
Memory in biological neural systems is believed to be stored in the synaptic weights. Computational models of such memory systems have been constructed to investigate, for example, optimal learning rules and storage capacity. Commonly, a synaptic weight in such models is represented by an unbounded, continuous real number. In biology, however, synaptic weights take values within certain bounds, and physiological experiments suggest that synaptic weight changes may occur in discrete steps. In contrast to networks with continuous, unbounded synapses, in networks with bounded synapses old memories decay automatically as they are overwritten by new ones [Parisi 1986].

Previous signal-to-noise ratio (SNR) analysis of the performance of discrete, bounded synapses has led to ambiguous conclusions [Fusi & Abbott 2007]. This is because altering parameters typically results in either 1) a decrease in initial SNR but a slower decay of the SNR (i.e. a longer memory lifetime), or 2) an increase in initial SNR but a shorter memory lifetime.

In this talk I show a possible way to resolve this ambiguity by analyzing the capacity of bounded, discrete synapses in terms of Shannon information, and demonstrate how this framework can be used to find optimal learning rules. Modelling a single neuron, I investigate how the information capacity depends on the number of synapses and the number of synaptic states, for both dense and sparse coding. Below a certain critical number of synapses per neuron (comparable to numbers found in biology), the capacity grows linearly with the number of synapses, while for larger numbers it grows only as the square root of the number of synapses.
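The SNR dilemma described in the abstract can be illustrated with a minimal toy model (a sketch for intuition, not the model analysed in the talk): binary synapses in {-1, +1}, where presenting a pattern overwrites each synapse with that pattern's value with probability q. The overlap with a stored pattern then decays roughly as q(1 - q)^t as later patterns arrive, so a larger q gives a stronger initial signal but a shorter memory lifetime.

```python
import random

def memory_overlap(n_syn=1000, q=0.2, t_max=10, n_trials=100, seed=0):
    """Toy model of bounded, discrete (binary) synapses.

    Each synapse takes a value in {-1, +1}. Presenting a pattern sets
    each synapse to the pattern's value with probability q (a simple
    stochastic learning rule). We store one tracked pattern, overwrite
    it with uncorrelated random patterns, and record the mean overlap
    ('signal') with the tracked pattern at each memory age t.
    """
    rng = random.Random(seed)
    overlaps = [0.0] * (t_max + 1)
    for _ in range(n_trials):
        target = [rng.choice((-1, 1)) for _ in range(n_syn)]
        w = [rng.choice((-1, 1)) for _ in range(n_syn)]
        # Store the tracked memory: each synapse copies it with prob. q.
        w = [ti if rng.random() < q else wi for wi, ti in zip(w, target)]
        overlaps[0] += sum(wi * ti for wi, ti in zip(w, target)) / n_syn
        for t in range(1, t_max + 1):
            # Overwrite with an uncorrelated random pattern: with prob. q
            # a synapse is reset to a fresh random value.
            w = [rng.choice((-1, 1)) if rng.random() < q else wi for wi in w]
            overlaps[t] += sum(wi * ti for wi, ti in zip(w, target)) / n_syn
    return [o / n_trials for o in overlaps]
```

Comparing, say, q = 0.2 and q = 0.5 reproduces the trade-off: the larger q starts with a higher overlap but drops below the smaller q within a few overwrites. Since the noise in the overlap is of order 1/sqrt(n_syn), multiplying the overlap by sqrt(n_syn) gives the SNR, and it is this ambiguity between initial SNR and lifetime that the Shannon-information analysis is intended to resolve.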