

Hopfield has also noted that real neurons need not make synapses both i → j and j → i, and questioned whether the symmetry Wij = Wji matters vis-à-vis neuronal activity. He carried out simulations with only one connection per pair, namely, Wij ≠ 0, Wji = 0, and found that without symmetry the probability of making errors increased, though the algorithm continued to generate stable minima; there was also a possibility that a minimum would be only metastable and eventually be replaced by another minimum. The symmetric synaptic coupling of Hopfield, however, provoked a great deal of criticism as being biologically unacceptable; as Toulouse [80] points out, Hopfield’s strategy was a “very clever step backwards.”
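Hopfield’s asymmetric-synapse experiment can be imitated in a few lines. The sketch below (network size, pattern count, and update budget are arbitrary illustrative choices, not Hopfield’s) stores ±1 patterns with a Hebbian rule and, in the asymmetric case, zeroes one direction of every coupling pair before recalling a corrupted pattern:

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_weights(patterns, symmetric=True):
    """Hebbian couplings W_ij from stored ±1 patterns; optionally keep
    only one direction of each pair (W_ij != 0, W_ji = 0)."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    if not symmetric:
        w = np.triu(w)          # zero out W_ji for j > i
    return w

def recall(w, state, steps=300):
    """Asynchronous deterministic updates of randomly chosen units."""
    s = state.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if w[i] @ s >= 0 else -1
    return s

# Store three random patterns in a 64-unit network and recall a
# corrupted version of the first one, with and without symmetry.
patterns = rng.choice([-1, 1], size=(3, 64))
for sym in (True, False):
    w = hebbian_weights(patterns, symmetric=sym)
    noisy = patterns[0].copy()
    noisy[:8] *= -1             # flip 8 of the 64 bits
    overlap = recall(w, noisy) @ patterns[0] / 64
    print(f"symmetric={sym}: overlap with stored pattern = {overlap:+.2f}")
```

An overlap of +1.00 means perfect recall; running the asymmetric case repeatedly illustrates the increased error probability Hopfield reported.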

In a later work, Hopfield [36] introduced electronic circuit modeling of a larger network of neurons with graded response (or sigmoidal input-output relation), depicting a content-addressable memory based on collective computational properties like those of two-state neurons. The relevant model facilitates the inclusion of propagation delays, jitter, and noise as observed in real neurons. The corresponding stochastic algorithm is asynchronous, as the interaction of each neuron is a stochastic process taking place at a mean rate for each neuron. Hence, Hopfield’s model, in general, differs from the synchronous system of Little, which might have additional collective properties.
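The distinction between Hopfield’s asynchronous updating and Little’s synchronous updating can be made concrete with a small sketch (the two-neuron example is illustrative): updating all neurons simultaneously admits limit cycles that one-at-a-time updating cannot sustain.

```python
import numpy as np

rng = np.random.default_rng(1)

def async_step(w, s):
    """Hopfield-style step: one randomly chosen neuron updates."""
    s = s.copy()
    i = rng.integers(len(s))
    s[i] = 1 if w[i] @ s >= 0 else -1
    return s

def sync_step(w, s):
    """Little-style step: every neuron updates from the same old state."""
    return np.where(w @ s >= 0, 1, -1)

# Two neurons with mutual inhibition: synchronous dynamics oscillates
# between (+1, +1) and (-1, -1), whereas asynchronous dynamics settles
# into a fixed point such as (+1, -1).
w = np.array([[0.0, -1.0], [-1.0, 0.0]])
s = np.array([1, 1])
for _ in range(4):
    s = sync_step(w, s)
    print(s)
```

The period-2 cycle printed here is exactly the kind of additional collective property a synchronous (Little) system can exhibit.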

Pursuant to the above studies on neural activity versus statistical mechanics, Ingber [67] developed an approach to elucidate the collective aspects of the neocortical system via nonlinear, nonequilibrium statistical mechanics. In the relevant studies, microscopic synaptic interactions consistent with anatomical observations were spatially averaged over columnar domains, and the macroscopic spatial-temporal aspects were described by a Lagrangian formalism [68]. However, the topological constraints, with the associated continuity relations posed by the columnar domains and the Lagrangian approach, are rather unrealistic.

5.7 Peretto’s Model

A more pragmatic method of analyzing neural activity via statistical physics was portrayed by Peretto [38], who considered the collective properties of neural networks by extending Hopfield’s model to Little’s model. Peretto’s approach rests on the following considerations:

  Inasmuch as statistical mechanics formalisms are arrived at in a Hamiltonian framework, Peretto “searches” for extensive quantities that depict the Hopfield network in the ground state as well as in noisy situations.
  Little’s model introduces a Markovian structure to neural dynamics. Hence, Peretto verifies whether the corresponding evolution equation would permit (at least under certain constraints) a Hamiltonian attribution to neural activity.
  Last, Peretto considers the feasibility of comparing the two models in terms of their storage capacity and associative memory properties.

The common denominator in all the aforesaid considerations, as conceived by Peretto, is again the statistical mechanics and/or spin-glass analogy that portrays a parallelism between Hopfield’s network and Little’s model of a neural assembly.

Regarding the first consideration, Peretto first enumerates the rules of synthesizing the Hopfield model, namely: 1) Every neuron i is associated with a membrane potential, Vi; 2) Vi is a linear function of the states of the neurons connected to i; that is, Vi refers to the somatic summation/integration given by Vi = ΣjCijSj (with Sj = 0 or 1 according to the firing state of neuron j), where Cij is the synaptic efficacy between the (upstream) neuron j and the (downstream) neuron i; 3) a threshold level VTi decides the state of neuron i: Si = 1 if Vi > VTi, and Si = 0 if Vi < VTi.
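These three rules condense into a one-sweep update. The sketch below uses a made-up 3-neuron coupling matrix and zero thresholds purely for illustration:

```python
import numpy as np

def update(C, S, V_T):
    """Peretto's threshold rule: V_i = sum_j C_ij S_j, then
    S_i = 1 if V_i > V_T_i else 0 (deterministic, all units at once)."""
    V = C @ S
    return np.where(V > V_T, 1, 0), V

# Illustrative 3-neuron network: C[i, j] is the synaptic efficacy from
# upstream neuron j to downstream neuron i; thresholds are all zero.
C = np.array([[ 0.0, 1.0, -0.5],
              [ 1.0, 0.0,  0.5],
              [-0.5, 0.5,  0.0]])
S = np.array([1, 0, 1])          # current firing states (0 or 1)
V_T = np.zeros(3)
S_new, V = update(C, S, V_T)
print("membrane potentials:", V)   # [-0.5, 1.5, -0.5]
print("next states:", S_new)       # [0, 1, 0]
```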

Hence, Peretto develops the following Hamiltonian to depict the Hopfield neural model analogous to the Ising spin model:

HN(I) = -Σi<j J̃ij σiσj - Σi hi0 σi     (5.12)

where σi = (2Si - 1) (so that σi = +1 when Si = 1 and σi = -1 when Si = 0), Jij = Cij/2, hi0 = ΣjCij/2 - VTi, and J̃ij = (Jij + Jji). In Equation (5.12), I represents the set of internal states, namely, I = {σi} = {σ1, σ2,…, σM} for i = 1, 2,…, M. The Hamiltonian HN is identified as an extensive parameter of the system. (It should be noted here that the concept of a Hamiltonian as applied to neural networks had already been proposed by Cowan as early as 1967 [65]. He defined a Hamiltonian to find a corresponding invariant for the dynamics of a single two-cell loop.)
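One way to see why HN is useful: for symmetric couplings, a single-neuron update can never increase it, so the dynamics settles into minima of HN. A numerical sketch in the spin variables σi (random couplings and fields of arbitrary size, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

def energy(J, h, sigma):
    """H_N(I) = -sum_{i<j} J_ij s_i s_j - sum_i h_i s_i, written with a
    0.5 factor because the symmetric J counts each pair twice."""
    return -0.5 * sigma @ J @ sigma - h @ sigma

n = 20
A = rng.normal(size=(n, n))
J = A + A.T                      # symmetrized couplings, J_ij = J_ji
np.fill_diagonal(J, 0.0)
h = rng.normal(size=n)
sigma = rng.choice([-1, 1], size=n)

energies = [energy(J, h, sigma)]
for _ in range(500):
    i = rng.integers(n)          # asynchronous single-spin update
    sigma[i] = 1 if J[i] @ sigma + h[i] >= 0 else -1
    energies.append(energy(J, h, sigma))

# With symmetric couplings each update aligns a spin with its local
# field, so the energy sequence is non-increasing.
assert all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:]))
print("initial H =", energies[0], " final H =", energies[-1])
```

Breaking the symmetry of J (e.g., using A alone) removes this Lyapunov property, which is the spin-language counterpart of Hopfield’s asymmetric-synapse observations above.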

Concerning the second consideration, Peretto formulates an evolution equation to depict a Markov process. The relevant master equation, written for the probability of finding the system in state I at time t, is shown to be of the Boltzmann type and hence has a Gibbs distribution as its steady-state solution.

Peretto shows that a Markovian process having the above characteristics can be described by at least a narrow class of Hamiltonians which obey the detailed balance principle. In other words, a Hamiltonian description of neural activity under Markovian statistics is still feasible, though with a constraint posed by the detailed balance principle, which translates into the synaptic interactions being symmetric (Jij = Jji).
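The detailed balance constraint can be verified numerically on a toy system (three spins with arbitrary symmetric couplings, all values illustrative): with heat-bath single-spin-flip rates, p(I)·W(I→I′) = p(I′)·W(I′→I) holds exactly for the Gibbs distribution of the Hamiltonian.

```python
import itertools
import numpy as np

rng = np.random.default_rng(3)
n, beta = 3, 0.7                     # tiny network, arbitrary temperature

A = rng.normal(size=(n, n))
J = A + A.T                          # symmetric couplings (J_ij = J_ji)
np.fill_diagonal(J, 0.0)
h = rng.normal(size=n)

def H(sigma):
    s = np.array(sigma)
    return -0.5 * s @ J @ s - h @ s

states = list(itertools.product([-1, 1], repeat=n))
p = np.array([np.exp(-beta * H(s)) for s in states])
p /= p.sum()                         # Gibbs steady-state distribution

def W(a, b):
    """Heat-bath rate for a single-spin flip a -> b (0 otherwise)."""
    if sum(x != y for x, y in zip(a, b)) != 1:
        return 0.0
    return np.exp(-beta * H(b)) / (np.exp(-beta * H(a)) + np.exp(-beta * H(b)))

# Detailed balance: p(a) W(a->b) == p(b) W(b->a) for every pair.
for ia, a in enumerate(states):
    for ib, b in enumerate(states):
        assert abs(p[ia] * W(a, b) - p[ib] * W(b, a)) < 1e-12
print("detailed balance holds for all", len(states) ** 2, "state pairs")
```

If J is made asymmetric, H is no longer well defined as an energy and this identity fails, which is precisely the constraint Peretto identifies.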


Copyright © CRC Press LLC
