Motivation

Rather than designing computational systems in the traditional bottom-up and top-down manner of connecting simple circuits into larger and more sophisticated systems that achieve some prescribed specification, neural networks utilize the emergent computational properties of many simple processing elements interacting collectively under the influence of global training regimes. This is in many ways similar to the dynamics of a system of particles operating under the influence of a number of physical laws.

The operation of Hopfield networks is analogous to that of dynamical physical systems. Such systems are often described by the total energy of the various possible configurations of their parts. Lower-energy configurations are the least likely to change, and thus the most stable; such systems will therefore tend to drift toward lower-energy configurations. One can imagine the energy of the system as a landscape of peaks and valleys. The lowest point in each valley or trough will be among the most stable configurations, as there is no path that can be followed to reach a lower point. In the absence of outside influences, the system will tend toward these points. These minima of the energy landscape become the attractors for the dynamics of the system (although in some systems, oscillating or more complex attractors can also occur). The saying "water always runs to the lowest point" embodies this principle, and there are many further examples in the physical sciences.

Figure 1 illustrates a simple energy landscape for two units. The x- and y-axes of the diagram represent the states of the two units, and the z-axis represents the energy corresponding to each configuration. As such, every possible configuration has some energy, and the system will tend toward configurations with lower energies. This can be visualized as a ball rolling down the surface to the nearest low point. Real systems will have many more units and, correspondingly, very high-dimensional energy landscapes.

Figure 1 Simple energy landscape.
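
To make this concrete, the following is a minimal sketch (not from the original article) of the standard Hopfield energy function, evaluated for a two-unit network like the one pictured; the weight matrix here is an illustrative assumption.

    import numpy as np

    # Standard Hopfield energy of a configuration s (units in {-1, +1})
    # under a symmetric weight matrix W with zero diagonal.
    def energy(s, W):
        return -0.5 * s @ W @ s

    # Illustrative two-unit network: a single connection of weight +1.
    W = np.array([[0.0, 1.0],
                  [1.0, 0.0]])

    for s in [np.array(c) for c in [(-1, -1), (-1, 1), (1, -1), (1, 1)]]:
        print(s, energy(s, W))

The two agreeing configurations, (-1, -1) and (+1, +1), come out at energy -1 (the valleys of the landscape), while the two disagreeing configurations sit at +1 (the peaks).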

Similarly, a Hopfield network stores 'memories' by defining an energy landscape in which each memory is a local minimum. When the resulting network is allowed to run freely, it will tend to settle in these minima, effectively retrieving the memory. Furthermore, if only a partial memory is available, the corresponding parts of the network can be 'clamped' to those values, effectively placing the network within the attractor basin of the entire memory. When the unclamped parts of the network are then allowed to run freely, the network will tend toward that attractor, thus completing the memory. This property allows the network to retrieve complete memories from any sufficiently large subset, making it a content-addressable memory system.
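
The clamp-and-settle procedure is easy to sketch in code. The following Python example is a hedged illustration rather than a prescribed implementation: it stores a single bipolar (+1/-1) pattern using the Hebbian outer-product rule discussed below, clamps the first half of the units to the memory's values, and lets the remaining units update asynchronously until they settle.

    import numpy as np

    rng = np.random.default_rng(0)

    # One stored memory; weights from the standard outer-product rule,
    # with the diagonal zeroed (no self-connections).
    memory = np.array([1, -1, 1, 1, -1, -1, 1, -1])
    N = len(memory)
    W = np.outer(memory, memory).astype(float) / N
    np.fill_diagonal(W, 0.0)

    # Partial cue: clamp the first four units to the memory's values;
    # the rest start in random states.
    clamped = np.arange(4)
    free = np.setdiff1d(np.arange(N), clamped)
    s = rng.choice([-1, 1], size=N)
    s[clamped] = memory[clamped]

    # Asynchronous updates on the unclamped units only; the partial cue
    # places the state in the memory's attractor basin, so the network
    # settles into the complete pattern.
    for _ in range(10 * N):
        i = rng.choice(free)
        s[i] = 1 if W[i] @ s >= 0 else -1

    print(np.array_equal(s, memory))  # True: the memory is completed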

From this second form, it is easy to see the Hebbian nature of the training rule. The strength of the weight on the connection between two neurons is determined by how often (that is, in how many data vectors) those neurons take on the same value. In this way, the networks are constructed according to Hebb's principle that "neurons which fire together, wire together."

Regardless of the representation of the values the units take on, no unit ever has a connection to itself: w_ii = 0 for every unit i.
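
As a concrete sketch of this storage rule (the function name and sample patterns here are illustrative assumptions):

    import numpy as np

    # Hebbian storage for bipolar (+1/-1) patterns: each weight counts how
    # often two units take on the same value across the stored patterns,
    # and the diagonal is zeroed so that no unit connects to itself.
    def hebbian_weights(patterns):
        patterns = np.asarray(patterns, dtype=float)
        _, N = patterns.shape
        W = patterns.T @ patterns / N   # w_ij = (1/N) * sum_p x_i^p * x_j^p
        np.fill_diagonal(W, 0.0)        # w_ii = 0 for every unit i
        return W

    W = hebbian_weights([[1, -1, 1, -1],
                         [1, 1, -1, -1]])
    print(W)  # units 0 and 3 disagree in both patterns, so W[0, 3] = -0.5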
