A Hopfield network operates on data sets consisting of binary vectors, and it contains exactly one unit per bit in those vectors. Each neuron, or unit, can take on only the binary values 0 and 1. The network can also be equivalently, and often more simply, defined with units taking on the values +1 and -1. Using the network therefore requires that a data set first be translated into an appropriate set of binary vectors. These vectors could represent whether any particular datum falls within one of a discrete set of categories. For example, each bit in the vector could represent the occurrence of a specific property in an observation. Another representation would be to turn on only the kth bit, that is, that bit assumes the value 1 and all other bits are 0, if the observation falls into the kth category.
The strengths of the connections (known as the weights) between any two units define how much positive or negative influence the activity of one unit has upon the activity of another unit. The memory-storage algorithm sets the weights of the network based on the set of n data vectors:

$$w_{ij} = \sum_{k=1}^{n} \left(2d_i^k - 1\right)\left(2d_j^k - 1\right)$$
where $d_i^k$ is the ith coefficient of the kth data vector, when the vectors take on the values 0 and 1. If the vectors take on the values +1 and -1, then the equivalent memory-storage formula is much simpler:

$$w_{ij} = \sum_{k=1}^{n} d_i^k d_j^k$$
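The storage rule for +/-1 vectors can be sketched as follows. This is a minimal illustration, not a definitive implementation: the function and variable names are my own, and the self-connections are set to zero, as is conventional for Hopfield networks.

```python
def store_patterns(patterns):
    """Build the weight matrix from a list of +/-1 pattern vectors.

    Implements w[i][j] = sum over patterns k of d_k[i] * d_k[j],
    with zero self-connections (w[i][i] = 0).
    """
    n = len(patterns[0])
    w = [[0] * n for _ in range(n)]
    for d in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:  # no self-connections
                    w[i][j] += d[i] * d[j]
    return w

# Store two 4-bit +/-1 patterns (illustrative data).
weights = store_patterns([[1, -1, 1, -1], [1, 1, -1, -1]])
```

Note that the resulting matrix is automatically symmetric, a property the convergence argument later in the text depends on.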
Retrieval proceeds by the binary threshold rule, with each unit setting its state according to

$$s_i = \begin{cases} 1 & \text{if } \sum_{j} w_{ji} s_j \geq \theta_i \\ 0 & \text{otherwise} \end{cases}$$

or, for units which take on values of +1 or -1:

$$s_i = \begin{cases} +1 & \text{if } \sum_{j} w_{ji} s_j \geq \theta_i \\ -1 & \text{otherwise} \end{cases}$$
where $s_i$ is the activity of the ith unit, $w_{ji}$ is the weight from unit j to unit i, and $\theta_i$ is the bias (or threshold) on unit i (commonly set to 0 for all units). The units update asynchronously and independently of one another.
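The asynchronous threshold update for +/-1 units can be sketched as below. The function names and the use of a seeded random order are my own choices; the text only requires that units update one at a time, independently.

```python
import random

def update_unit(state, weights, theta, i):
    """Set unit i to +1 if its weighted input reaches its threshold, else -1."""
    total = sum(weights[i][j] * state[j] for j in range(len(state)))
    state[i] = 1 if total >= theta[i] else -1

def run_async(state, weights, theta, steps, seed=0):
    """Update randomly chosen units one at a time (asynchronous updates)."""
    rng = random.Random(seed)
    for _ in range(steps):
        update_unit(state, weights, theta, rng.randrange(len(state)))
    return state
```

A state in which no call to `update_unit` changes anything is exactly a local minimum of the energy defined next.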
Given a set of weights, each state of the network can be assigned a global 'energy', defined as

$$E = -\frac{1}{2}\sum_{i}\sum_{j} w_{ij} s_i s_j + \sum_{i} \theta_i s_i$$
For a given state of the network, the effect of the kth unit on the global energy can be measured by taking the difference in energy when this unit is false (0) and when it is true (1):

$$\Delta E_k = E(s_k = 0) - E(s_k = 1) = \sum_{j} w_{kj} s_j - \theta_k$$
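The energy function and the energy gap of a single unit can be sketched for 0/1 units as follows (names are illustrative; the gap identity holds for symmetric weights with zero self-connections):

```python
def energy(state, weights, theta):
    """Global energy: E = -1/2 * sum_ij w_ij s_i s_j + sum_i theta_i s_i."""
    n = len(state)
    quad = sum(weights[i][j] * state[i] * state[j]
               for i in range(n) for j in range(n))
    return -0.5 * quad + sum(theta[i] * state[i] for i in range(n))

def energy_gap(state, weights, theta, k):
    """Delta E_k = E(s_k = 0) - E(s_k = 1) = sum_j w_kj s_j - theta_k."""
    return sum(weights[k][j] * state[j] for j in range(len(state))) - theta[k]
```

Evaluating `energy` with unit k clamped to 0 and then to 1, and subtracting, reproduces the closed-form `energy_gap`, which is exactly the quantity the threshold rule compares against zero.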
Therefore, by updating the states of the units according to the binary threshold rule, the energy of the network will decrease monotonically until it reaches a local minimum, where no further updates will occur. This convergence is only guaranteed, however, when the weights of the network are symmetric, that is,

$$w_{ij} = w_{ji}$$
Networks without this constraint may oscillate or move chaotically through state space. Experiments performed by Hopfield seemed to indicate that the extent of these perturbations was confined to a relatively small area around the local minima, and so did not seriously impede the network's ability to act as a content-addressable memory system.
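Putting the pieces together, a minimal content-addressable-memory sketch stores one +/-1 pattern, corrupts one bit, and lets threshold updates pull the state back. All names are my own; a fixed-order sweep is used here instead of random asynchronous updates, which is a simplification (units still update one at a time).

```python
def recall(pattern, probe, sweeps=3):
    """Recover a single stored +/-1 pattern from a noisy probe vector."""
    n = len(pattern)
    # Hebbian weights for one stored pattern, zero self-connections.
    w = [[pattern[i] * pattern[j] if i != j else 0 for j in range(n)]
         for i in range(n)]
    state = list(probe)
    for _ in range(sweeps):
        for i in range(n):  # one unit at a time, fixed sweep order
            total = sum(w[i][j] * state[j] for j in range(n))
            state[i] = 1 if total >= 0 else -1
    return state

stored = [1, -1, 1, -1, 1]
noisy = [1, -1, -1, -1, 1]   # one bit flipped
result = recall(stored, noisy)
```

With a single stored pattern and one flipped bit, every update either leaves a correct bit alone or repairs the wrong one, so the state settles into the stored memory, the local energy minimum.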