Thompson and Gibson further observed that the gradual transition noted above corresponds to the factor β being finite-valued. In the limit β → ∞, the model reduces to the McCulloch-Pitts regime, in which the neuron is classified as a logical, or formal, neuron; implicitly, this limit also endows the network of Little with a long-range order.
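The role of β can be illustrated with a short sketch. It assumes the standard sigmoidal firing probability commonly associated with Little's stochastic neuron, p(fire) = 1/(1 + exp(-2βh)) for local field h; this specific functional form is an assumption here, not quoted from the text:

```python
import math

def fire_probability(h, beta):
    """Probability that a neuron fires given its local field h.

    For finite beta the fire/no-fire transition is gradual (Little's
    stochastic neuron); as beta -> infinity the rule approaches the
    deterministic step function of the McCulloch-Pitts formal neuron.
    """
    return 1.0 / (1.0 + math.exp(-2.0 * beta * h))

# Gradual transition at moderate beta:
p_moderate = fire_probability(0.5, beta=1.0)    # strictly between 0 and 1
# Near-deterministic behavior at large beta (McCulloch-Pitts limit):
p_sharp = fire_probability(0.5, beta=100.0)     # close to 1
p_sharp_neg = fire_probability(-0.5, beta=100.0)  # close to 0
```

At h = 0 the probability is 1/2 for any β, while away from zero the large-β limit makes the decision effectively deterministic.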
In the continuous/graded state transition corresponding to a moderate time-scale order, the firing pattern can be of two types: (1) the burst-discharge pattern, in which the output of an individual neuron is a series of separated bursts of activity rather than single spikes; that is, the network fires a fixed pattern for some time and then suddenly changes to a different pattern, which is likewise maintained for many time steps; and (2) the quasi-reverberation pattern, in which each neuron makes a deterministic fire or no-fire decision at multiples of a basic unit of time, and a group of such neurons may form a closed, self-exciting loop yielding a cyclically repeating pattern called a reverberation. Thompson and Gibson identified the possible existence of both patterns as governed by the Markovian statistics of neuronal state transitions. Their further investigations on this topic [79], with relevance to Little's model, revealed that a single model neuron can produce a wide range of average output patterns, including spontaneous bursting and tonic firing. Their study was also extended to two-neuron activities. On the basis of their results, they concluded that Little's model "produces a remarkably wide range of physically interesting average output patterns. In Little's model, the most probable behavior [of the neuronal network] is a simple consequence of the synaptic connectivity."
That is, the type of each neuron and the synaptic connections are the primary properties: they determine the most likely behavior of the network. The actual output could be slightly modified or stabilized as a result of various secondary effects [such as accommodation or postinhibitory rebound].
5.6 Hopfield's Model
As observed by Little, the collective properties of a large number of interacting neurons closely parallel those of physical systems made from a large number of simple elements, where interactions among many elementary components yield collective phenomena such as the stable magnetic orientations and domains of a magnetic system, or the vortex patterns of fluid flow. Hence, Hopfield in 1982 [31] posed the natural question: "Do analogous collective phenomena in a system of simple interacting neurons have useful computational correlates?" He re-examined this old and fundamental question with a new model and showed that important computational properties do indeed arise.
Hopfield's thesis compares neural networks with physical systems in respect of emergent collective computational abilities. It follows the time evolution of a physical system described by a set of general coordinates, with a point in state-space representing the instantaneous condition of the system; this state-space may be either continuous or discrete (as in the case of the M Ising spins considered by Little).
The input-output relationship for a neuron prescribed by Hopfield on the basis of the collective properties of a neural assembly has relevance to the earlier work of Little and others, which can be stated as follows [31]: Little, Shaw, and Roney developed ideas on the collective functioning of neural nets based on on/off neurons and synchronous processing. In their model, however, the relative timing of action-potential spikes was central and resulted in reverberating action-potential trains. Hopfield's model and theirs have limited formal similarity, although there may be connections at a deeper level.
Further, in Hopfield's model, when the synaptic weight (connection strength) Wij is symmetric, the state changes continue until a local minimum is reached. Hopfield took random patterns with ξiμ = ±1, each with probability 1/2, assumed Wij = (1/N) Σμ ξiμ ξjμ for (i, j) ∈ N, and allowed a sequential dynamics of the form Si(t + Δt) = sgn[hi(t)], where sgn(x) is the sign of x and hi = Σj Wij Sj represents the postsynaptic potential, or local field. Hopfield's dynamics is such that the state of a neuron is changed (a spin is flipped) iff the energy HN = -Σi≠j Wij Si Sj is thereby lowered. That is, the Hamiltonian HN is a so-called Lyapunov function for the Hopfield dynamics, which therefore converges to a local minimum, or ground state. Equivalently, the equations of motion for a network with symmetric connections (Wij = Wji) always lead to convergence to stable states in which the outputs of all neurons remain constant. The presumed symmetry of the network is thus essential to the relevant mathematics. However, the feasibility of such symmetry existing in real neurons has been viewed with skepticism, as discussed earlier.
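The Hebbian weight prescription, the sequential sgn dynamics, and the energy-lowering property described above can be sketched as follows. The sizes (100 neurons, 5 stored patterns, 10 flipped bits) and the random seed are illustrative choices, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 100, 5                              # neurons, stored patterns
xi = rng.choice([-1, 1], size=(P, N))      # random patterns, +/-1 with probability 1/2

# Hebbian weights W_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, symmetric, zero diagonal
W = (xi.T @ xi) / N
np.fill_diagonal(W, 0.0)

def energy(S):
    """Hamiltonian H_N = -sum_{i != j} W_ij S_i S_j (each pair counted twice)."""
    return -S @ W @ S

def update(S, steps=1000):
    """Sequential dynamics S_i <- sgn(h_i) with h_i = sum_j W_ij S_j.
    With symmetric W, each accepted flip can only lower the energy."""
    S = S.copy()
    for _ in range(steps):
        i = rng.integers(N)                # pick one neuron at a time
        h = W[i] @ S                       # local (postsynaptic) field
        S[i] = 1 if h >= 0 else -1
    return S

# Start near stored pattern 0, with 10 of its 100 bits flipped
S0 = xi[0].copy()
flip = rng.choice(N, size=10, replace=False)
S0[flip] *= -1

S_final = update(S0)                       # relaxes toward a local energy minimum
```

Because W is symmetric and the update aligns each spin with its local field, the energy of the final state never exceeds that of the noisy initial state, and at this low storage load the network typically settles back into the stored pattern.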