In proposing the analogy between a network of neurons and the statistical-mechanical Ising spin system, Little considered that the temporal development of the network (in discrete time-steps) corresponds to a progression across one dimension of the Ising lattice. In the Ising model, as indicated earlier, the spin Si at each lattice site i can take only two orientations, up and down, denoted by Si = +1 (up) and Si = -1 (down). The analogy to a neural network is realized by identifying each spin with a neuron and associating the upward orientation Si = +1 with the active state 1 and the downward orientation Si = -1 with the resting state 0. Further, he suggested that certain networks of neurons could undergo a transition from a disordered state to an ordered state analogous to the Ising-lattice phase transition. Since this implies temporal correlations, he pointed out that these ordered states might be associated with memory as well.*
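The spin/neuron identification above can be written out as a minimal sketch (the function names are illustrative, not from Little's original formulation):

```python
def neuron_to_spin(n):
    """Map a binary neuron state (1 = active, 0 = resting) to an Ising spin."""
    return 2 * n - 1   # 1 -> +1 (up), 0 -> -1 (down)

def spin_to_neuron(s):
    """Inverse map: spin +1 / -1 back to neuron state 1 / 0."""
    return (s + 1) // 2

# A network state at one time step is then just one row of spins in the lattice:
network_state = [1, 0, 1, 1, 0]                       # active/resting neurons
spin_row = [neuron_to_spin(n) for n in network_state]
assert spin_row == [1, -1, 1, 1, -1]
assert [spin_to_neuron(s) for s in spin_row] == network_state
```

Successive time steps of the network then correspond to successive rows of the two-dimensional lattice, as elaborated below.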
*The memory, in general, can be classified on the basis of three time scales, namely:
Short-term memory: This is an image-like memory lasting from a fraction of a second to seconds. In reference to the neuronal assembly, it corresponds to a specific firing pattern, or a cyclic group of patterns, persisting in the active states over this time period after excitation by a strong stimulus (which overrides any internal firing trend in the network, forcing it into a specific pattern or set of patterns; such a stimulus would facilitate enhancement of certain neurocortical parameters associated with firing).
Intermediate memory: This could last up to hours, during which time imprinting into long-term memory can be affected by drugs, electric shock, etc. The facilitated synaptic parameters may still cause the reexcitation of a pattern in the network.
Long-term memory: This refers to an almost permanent memory reflecting plastic or permanent changes in synaptic strength and/or the growth of new synapses; here too the facilitated parameters may enable pattern reexcitation.
Little's model is a slightly more complex description of the logical or formal neuron due to McCulloch and Pitts [7]. It accounts for the chemical transmission at synapses. Its model parameters are the synaptic potentials, the dichotomous thresholds, and a quantity β which represents the net effect of variability in synaptic transmission on neural firing behavior. In Little's model, the probability of firing ranges from 0 to 1 and is a function of the difference between the total membrane potential and the threshold. Further, the probability of each neuron's firing is such that the time evolution of a network of Little's neurons can be regarded as a Markov process.
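A sigmoidal firing rule with these properties can be sketched as follows; the specific logistic form shown is one common way of realizing the description above, not necessarily Little's exact expression:

```python
import math

def firing_probability(potential, threshold, beta):
    """Firing probability in (0, 1), depending only on the difference
    between the total membrane potential and the threshold.
    beta measures the reliability of synaptic transmission: as beta -> inf
    the rule approaches the deterministic McCulloch-Pitts step function."""
    return 1.0 / (1.0 + math.exp(-2.0 * beta * (potential - threshold)))

# At threshold the neuron fires with probability 1/2:
assert abs(firing_probability(0.5, 0.5, beta=4.0) - 0.5) < 1e-12

# Large beta -> nearly deterministic (McCulloch-Pitts) behavior:
assert firing_probability(1.0, 0.5, beta=50.0) > 0.999
assert firing_probability(0.0, 0.5, beta=50.0) < 0.001
```

Because each neuron's firing at the next time step depends only on the current state through such probabilities, the network's evolution forms a Markov chain.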
To elaborate his model, Little considered an isolated neural network. He analyzed the state of the system pictured in terms of the neurons being active or silent at a given time and looked at the evolution of such a state in discrete time-steps τ greater than the refractory period (τR) for long periods (greater than 100 τ), searching for correlations of such states. He showed that long-time correlation of the states would occur if a certain transfer matrix has approximately degenerate maximum eigenvalues. He suggested that these persistent states are associated with a short-term memory.
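The connection between near-degenerate maximum eigenvalues and long-time correlations can be illustrated with a toy two-state Markov chain (the matrix and the switching probability eps are hypothetical, chosen only to make the effect visible):

```python
def eigenvalues_2x2(m):
    """Eigenvalues of a 2x2 matrix via the characteristic polynomial."""
    (a, b), (c, d) = m
    tr, det = a + d, a * d - b * c
    disc = (tr * tr - 4 * det) ** 0.5
    return (tr + disc) / 2, (tr - disc) / 2

# Stochastic transition matrix between two network firing patterns.
# Small switching probability -> near-degenerate top eigenvalues.
eps = 0.01
T = [[1 - eps, eps],
     [eps, 1 - eps]]
lam1, lam2 = eigenvalues_2x2(T)
assert abs(lam1 - 1.0) < 1e-12            # leading eigenvalue of a stochastic matrix
assert abs(lam2 - (1 - 2 * eps)) < 1e-12  # nearly degenerate with lam1
# State correlations decay like lam2**t, so near-degeneracy means a pattern
# persists over many time steps (here on the order of 1/(2*eps) = 50 steps).
```

In Little's picture such persistent patterns are the candidates for short-term memory.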
Little related the (three-dimensional) isolated system of M neurons quantized in discrete time-steps to the spin states of a two-dimensional Ising model of M spins with no connections between spins in the same row. The state of the neural network at an instant of time corresponds to the configuration of spins in a row of the lattice of the crystal. The state of the neurons after one time step τ corresponds to the spin configuration in the next row.
The potential φij accruing at time t to the ith neuron from its jth synapse, due to the firing of the jth neuron at time (t - τ), was related by analogy to the energy of spin interactions φij between the ith and jth spins on adjacent rows only. (Unlike in the Ising problem, however, the neural connections are not symmetric.) It was assumed that the probability of neuronal firing was given by an expression similar to the partition function of the spin system. The persistent (in time) states of the neural network were therefore studied via the usual approach of analyzing long-range (spatial) order in spin systems.
A suitable product of the probabilities for firing or not firing constitutes the transition matrix elements for the neuronal state configurations at successive time intervals. As is well known from Ising model calculations, degeneracy of the maximum eigenvalues of the transition matrix is associated with condensation of the spin system below the Curie point temperature and corresponds to a new phase and long-range order. Hence, a factor (β) representing the (pseudo) temperature of the neural network appears inevitably in the transition matrix of Little's model.
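The construction of such a transition matrix, with each element a product of per-neuron firing/not-firing probabilities, can be sketched as follows (the weights, thresholds, and the logistic firing rule are illustrative assumptions, not parameters from the text):

```python
import itertools
import math

def firing_prob(state, weights, thresholds, beta, i):
    """Probability that neuron i fires at t + tau, given the state at t."""
    h = sum(w * s for w, s in zip(weights[i], state))
    return 1.0 / (1.0 + math.exp(-2.0 * beta * (h - thresholds[i])))

def transition_matrix(weights, thresholds, beta):
    """Element T[new][old]: product over neurons of P(fire) or P(not fire)."""
    M = len(thresholds)
    states = list(itertools.product([0, 1], repeat=M))
    T = [[0.0] * len(states) for _ in states]
    for a, old in enumerate(states):
        probs = [firing_prob(old, weights, thresholds, beta, i) for i in range(M)]
        for b, new in enumerate(states):
            T[b][a] = math.prod(p if n else 1 - p for p, n in zip(probs, new))
    return T

# Two-neuron example with hypothetical parameters:
T = transition_matrix(weights=[[0.0, 1.0], [1.0, 0.0]],
                      thresholds=[0.5, 0.5], beta=2.0)
# Each column is a probability distribution over the 4 successor states:
assert all(abs(sum(col) - 1.0) < 1e-12 for col in zip(*T))
```

The eigenvalue spectrum of this matrix is what is examined for the near-degeneracy associated with persistent states; β enters every element through the firing probabilities, playing the role of an inverse pseudo-temperature.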
Considering an isolated network or ganglion of M neurons, the potential of a neuron is determined effectively by the integrated effects of all excitatory postsynaptic potentials as well as inhibitory postsynaptic potentials received during a period of summation (which is on the order of a few milliseconds). The neurons are assumed to fire at intervals of τ, which is on the order of the refractory period τR, also a few milliseconds in duration. Conduction times between neurons are taken to be small. At each time interval, the neurons are assumed to start with a clean slate (each neuron's potential is reset to its resting value). This corresponds to a first-order Markov process. All properties of the synaptic junctions are assumed to be constant over the time scales of interest (implying an adiabatic hypothesis).
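The synchronous update with potential reset described above can be sketched as a simulation loop (the weight matrix, thresholds, and logistic firing rule are illustrative assumptions):

```python
import math
import random

def step(state, weights, thresholds, beta, rng):
    """One synchronous time step tau: each neuron integrates its summed
    postsynaptic potentials, fires stochastically, and its potential is then
    reset to rest (clean slate), so the next state depends only on the
    current one -- a first-order Markov process."""
    new_state = []
    for i, theta in enumerate(thresholds):
        h = sum(w * s for w, s in zip(weights[i], state))  # summed PSPs
        p = 1.0 / (1.0 + math.exp(-2.0 * beta * (h - theta)))
        new_state.append(1 if rng.random() < p else 0)
    return new_state

rng = random.Random(0)
weights = [[0.0, 1.0, -0.5],
           [1.0, 0.0, -0.5],
           [0.5, 0.5, 0.0]]      # fixed synapses (adiabatic hypothesis)
state = [1, 0, 1]
for _ in range(100):             # evolve for many time steps tau
    state = step(state, weights, thresholds=[0.2, 0.2, 0.2], beta=3.0, rng=rng)
assert all(s in (0, 1) for s in state)
```

Note that the weights need not be symmetric, in keeping with the asymmetry of neural connections noted earlier.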