

Convergence of the system to a stable state refers to E reaching its global minimum. This is feasible in the absence of the stochastic variable ηi (caused by cellular disturbance/noise). A finite ηi, however, and the resulting randomness could destabilize the network's march toward the global minimum in any optimization search procedure. For example, a set of variables restricted to the two dichotomous limits (0 and 1) may represent candidate solutions to a discrete optimization problem. A neuron can be assigned to each variable, with the optimization criterion specified by the energy function E of Equation (6.18). From this energy function, the coupling weights Wij and the external input Si can be derived deterministically in the absence of the disturbance/noise function ηi. That is, starting from an arbitrary initial state, and with an appropriate scaling factor Λ assigned to the nonlinear function F, each neuron settles into a final stable state of 0 or 1. A classical example is the traveling salesman problem: assigning a closed tour over a set of N cities such that the length of the tour is minimized, subject to the constraints that no city is omitted or visited twice. There, a high output of neuron (i, j), that is, an output close to its maximum value of 1, represents city i occupying position j in the tour.
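The energy function for this traveling-salesman mapping can be sketched numerically. The snippet below is illustrative only: the penalty weights A, B, D and the layout V[x, i] = 1 meaning "city x occupies tour position i" follow the common Hopfield-Tank convention and are assumptions, not this book's notation.

```python
import numpy as np

def tsp_energy(V, d, A=1.0, B=1.0, D=1.0):
    """Hopfield-Tank style TSP energy.

    V : (n, n) array; V[x, i] ~ 1 if city x occupies tour position i.
    d : (n, n) symmetric distance matrix.
    """
    # A-term: each city must occupy exactly one position (row constraint).
    row_pen = A * np.sum((V.sum(axis=1) - 1.0) ** 2)
    # B-term: each position must hold exactly one city (column constraint).
    col_pen = B * np.sum((V.sum(axis=0) - 1.0) ** 2)
    # D-term: tour length over cyclically adjacent positions.
    V_next = np.roll(V, -1, axis=1)          # V_next[y, i] = V[y, (i+1) mod n]
    length = D * np.sum(d * (V @ V_next.T))  # sum_{x,y,i} d[x,y] V[x,i] V[y,i+1]
    return row_pen + col_pen + length
```

For a valid tour the penalty terms vanish and the energy reduces to the tour length; any configuration that omits or repeats a city pays a positive penalty, which is exactly the constraint structure described above.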

In the presence of η(t), however, the network model described above may become unstable; that is, the monotonic decrease of the energy function and its convergence to a minimum would be jeopardized, as the following analysis of the Hopfield energy functional shows:

The evolution of E with time in the presence of η(t), for the dynamics of Equation (6.16) and the input-output relation σi = F(Si), can be written as:
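A sketch of this expression, assuming the standard continuous-Hopfield conventions (the chain rule applied to Et over the outputs σi; the book's exact form may differ), is:

```latex
\frac{dE_t}{dt}
  \;=\; \sum_i \frac{\partial E_t}{\partial \sigma_i}\,\frac{d\sigma_i}{dt}
  \;=\; \sum_i \frac{\partial E_t}{\partial \sigma_i}\,
        \frac{dF(S_i)}{dS_i}\,\frac{dS_i}{dt},
```

with dSi/dt given by the noisy dynamics of Equation (6.16).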

Using Equation (6.18),
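A sketch of the resulting expression, again under standard continuous-Hopfield assumptions (the exact signs and groupings in the book's Equation may differ), is:

```latex
\frac{dE_t}{dt}
  \;=\; -\,\frac{1}{\Lambda}\sum_i \frac{dF^{-1}(\sigma_i)}{d\sigma_i}
        \left(\frac{d\sigma_i}{dt}\right)^{2}
  \;+\; \sum_i \eta_i(t)\,\frac{d\sigma_i}{dt}.
```

Since F is monotone increasing, dF⁻¹(σ)/dσ > 0 and the first term is nonpositive; the η-term has no definite sign, which is why monotonic decrease is guaranteed only when ηi = 0.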

The above equation permits Et(t) to decrease monotonically (that is, dEt/dt ≤ 0) and converge to a minimum only in the absence of η(t). When such convergence occurs for an increasing scaling factor Λ (ultimately reaching infinity at the McCulloch-Pitts limit), ∫F⁻¹(σ)dσ approaches zero over any interval; the difference (E − Et) therefore becomes negligible for σi in that interval. That is, the minimum of E remains close to that of Et; but this separation widens as the strength of η increases.

Failure to reach the global minimum in an optimization problem leaves the solution search suboptimal, and the corresponding computation time increases considerably. Two methods of countering the effects of disturbances in the neural network have been suggested by Bulsara et al. [90]. With a certain critical value of the nonlinearity, the system can be forced into a double-well potential and made to settle into one or the other stable state. Alternatively, by careful choice of the constant input bias term θi, the system can be driven to find the global minimum more rapidly.
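The role of the bias term can be illustrated on a one-dimensional caricature of the energy surface. The sketch below is illustrative only: the quartic potential, step size, and parameter names are assumptions, not Bulsara et al.'s model. It performs noisy gradient descent on a double-well potential tilted by a bias θ:

```python
import numpy as np

def descend(x0, theta=0.0, eta=0.0, steps=5000, dt=1e-2, seed=0):
    """Euler-Maruyama descent on the double-well V(x) = x^4/4 - x^2/2 - theta*x.

    theta : constant bias tilting the potential (deepens one well).
    eta   : strength of the additive disturbance/noise.
    """
    rng = np.random.default_rng(seed)
    x = x0
    for _ in range(steps):
        grad = x ** 3 - x - theta                    # V'(x)
        x += -dt * grad + np.sqrt(dt) * eta * rng.normal()
    return x
```

Without noise (eta = 0), descent starting in the left basin settles in the shallow local minimum near x ≈ −0.88, even though the bias θ makes the right well (x ≈ +1.09) globally deeper; a finite eta lets the state hop between wells, while θ determines which well is the global minimum.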

In the neural system discussed, when the variable W reaches a stable equilibrium value, the total energy of the system must be at its minimum (Lyapunov's condition). The presence of η(t), however, offsets this condition, and the corresponding wandering of W in the phase plane can be traced as a phase trajectory. Such a (random) excursion of W, and the resulting system instability, is therefore specified implicitly by the joint instability of the firings of the (presynaptic and/or postsynaptic) neurons coupled through the synaptic connectivity.

Hence, in the presence of noise/disturbance, the random variate W(1) can be specified in terms of its value under noiseless conditions, namely W(2), by a linear temporal-gradient relation over the low-pass action, as follows:

where WR is the root-mean-squared value of W, namely ⟨W²⟩^(1/2). By virtue of Equations (6.16)-(6.19) and (6.21), the time derivative of the Lyapunov energy function Et under noisy conditions can be written (under the assumption that ∂WR/∂t is invariant over the low-pass action) as:

Hence, it is evident that as long as the temporal gradient of WR in Equation (6.22), or the strength of the noise η in Equation (6.20), remains finite, the system will not, per se, reach the global minimum and hence a stable state.
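This conclusion can be checked numerically on a toy network. The sketch below is illustrative: the two-neuron weights, the sigmoid gain Λ = 5, and the noise model are assumptions. It integrates continuous Hopfield dynamics and records the full energy E, including the (1/Λ)∫F⁻¹(σ)dσ term, so that E is a Lyapunov function in the noiseless case:

```python
import numpy as np

def energy_trace(eta=0.0, steps=4000, dt=1e-2, lam=5.0, seed=1):
    """Integrate a two-neuron continuous Hopfield net; return the energy E(t).

    E = -1/2 sigma.W.sigma - theta.sigma + (1/lam) * sum_i H(sigma_i),
    where H(s) = s*log(s) + (1-s)*log(1-s) is the integral of F^{-1}.
    """
    rng = np.random.default_rng(seed)
    W = np.array([[0.0, 1.0], [1.0, 0.0]])        # symmetric coupling W_ij
    theta = np.zeros(2)                           # constant bias terms
    S = np.array([0.3, -0.2])                     # internal states S_i
    F = lambda s: 1.0 / (1.0 + np.exp(-lam * s))  # sigmoid with gain lam
    E = []
    for _ in range(steps):
        sig = np.clip(F(S), 1e-9, 1.0 - 1e-9)     # keep the log terms finite
        E.append(-0.5 * sig @ W @ sig - theta @ sig
                 + np.sum(sig * np.log(sig) + (1 - sig) * np.log(1 - sig)) / lam)
        # dS/dt = W.sigma + theta - S + noise  (Equation (6.16)-style dynamics)
        S += dt * (W @ sig + theta - S) + np.sqrt(dt) * eta * rng.normal(size=2)
    return np.array(E)
```

With eta = 0 the trace decreases monotonically to a minimum; with a finite eta the trace keeps fluctuating upward, i.e., the network never settles at the minimum, in line with the statement above.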



Copyright © CRC Press LLC
