4.9 Boltzmann Machine as a Connectionist Model
Apart from modeling neural networks for neurological fidelity, an adjunct consideration is the depiction of neural computation. Neural computational capability stems predominantly from the massive set of connections with variable strengths between the neural (processing) cells. That is, a neural complex or neuromorphic network is essentially a connectionist model as defined by Feldman and Ballard [59]: a massively distributed parallel-processing arrangement. Its relevant attributes can be summarized by the following architectural aspects and activity-related functions:
- Dense interconnection prevails between neuronal units with variable strengths.
- Strength of interconnections specifies degree of interaction between the units.
- The state of the connectionist processing unit has dichotomous values (corresponding to firing or non-firing states of real neurons). That is, oi ∈ {0, 1}.
- The neural interaction can be inhibitory or excitatory. The algebraic sign of interconnecting weights depicts the two conditions.
- The response that a unit delivers to its neighbors can be specified by a scalar nonlinear function (F) applied to the weighted inputs. That is:

  oi = F(Σj∈N Wij oj)

where oi is the response of unit i, Wij is the strength of the interconnection between units i and j, and N represents the set of neighbors of i.
- Every unit operates in parallel, simultaneously adjusting its state according to the states of its neighbors. The dynamics of these state changes lead the units to settle at steady (nonvarying) values. The network then freezes at a global configuration.
- The units in the network cooperatively optimize a global measure of the network with the information drawn from the local environment.
- The network information is thus distributed over the network and stored as interconnection weights.
- The Boltzmann machine (as a connectionist model) has only dichotomous states, oi ∈ {0, 1}. In contrast, neural modeling has also been done with continuous-valued states, as in the Hopfield and Tank [34] model pertinent to a neural decision network.
- In the Boltzmann machine, the response function F is stochastic. (There are other models such as the perceptron model, where the response function F is regarded as deterministic.)
- The Boltzmann machine is a symmetrical network. That is, its connections are bidirectional with Wij = Wji. (Models such as the feed-forward network, however, assume only unidirectional connections for the progression of state-transitional information.)
- The Boltzmann machine is adaptable for both supervised and unsupervised training. That is, it can learn by capturing the statistical regularities in the stimuli it receives from the environment and adjusting its weights accordingly (unsupervised learning), or it can learn from a set of classification flags to identify the correct output (supervised learning).
- The Boltzmann machine includes hidden units that do not participate directly as visible units in neural processing; these hidden units capture the higher-order regularities in the learning process.
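The attributes above can be illustrated with a short sketch. The code below is a minimal, hypothetical example (not from the source text): it assumes a symmetric weight matrix W, dichotomous states oi ∈ {0, 1}, and a stochastic response function F that turns a unit on with a sigmoid probability of its net input.

```python
import math
import random

random.seed(0)  # fixed seed so the stochastic response is reproducible

def stochastic_response(weights, states, i, T=1.0):
    """Stochastic response function F for unit i: the unit turns on with a
    sigmoid probability of its weighted net input, so its state stays in {0, 1}."""
    net = sum(weights[i][j] * states[j] for j in range(len(states)) if j != i)
    p_on = 1.0 / (1.0 + math.exp(-net / T))  # probability of the firing state
    return 1 if random.random() < p_on else 0

# Symmetric weights (Wij = Wji): positive entries are excitatory,
# negative entries inhibitory. Values here are arbitrary for illustration.
W = [[0.0, 2.0, -1.0],
     [2.0, 0.0, 0.5],
     [-1.0, 0.5, 0.0]]

o = [1, 0, 1]                        # dichotomous unit states
o[1] = stochastic_response(W, o, 1)  # unit 1 responds to its neighbors
```

Because the response is probabilistic rather than deterministic, repeated updates at the same net input can yield different states; this is precisely what distinguishes the Boltzmann machine's F from the deterministic perceptron response mentioned above.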
Construction of a Boltzmann machine relies on the following considerations as spelled out by Aarts and Korst [52]: The strength of a connection in a Boltzmann machine can be considered as a quantitative measure of the desirability that the units joined by the connection are both on. The units in a Boltzmann machine try to reach a maximal consensus about their individual states, subject to the desirabilities expressed by the connection strengths. To adjust the states of the individual units to the states of their neighbors, a probabilistic state transition mechanism is used which is governed by the simulated annealing algorithm.