Jordan Network

Jordan proposed a partially recurrent network by adding recurrent links from the network's output to a set of context units in a context layer, and from the context units to themselves. The Jordan network operates in the following steps: (1) the output at each state is fed back to the context units and mixed with the input representing the next state (Figure 3); (2) this input-output combination constitutes the new network state for processing at the next time step; and (3) after several steps, the pattern present in the context units together with the input units is characteristic of the particular sequence of states. The self-connections in the context layer give the context units Q a memory of their own previous activations.

In discrete time, the context units Q are updated according to eqn [16]

Q(t + 1) = aQ(t) + y(t)    [16]

where y is the activation of the output nodes and a (0 < a < 1) is the strength of the self-connections.
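The update in eqn [16] can be sketched numerically. The values of a and y and the number of steps below are illustrative assumptions, not from the text:

```python
import numpy as np

a = 0.5                    # self-connection strength, 0 < a < 1 (illustrative)
y = np.array([0.2, 0.8])   # output activations, held fixed here for clarity
Q = np.zeros(2)            # context units start at zero

# Apply eqn [16] for a few steps: Q(t+1) = a*Q(t) + y(t)
for t in range(3):
    Q = a * Q + y

print(Q)  # → [0.35 1.4 ]
```

Because a < 1, each context unit accumulates an exponentially decaying trace of past outputs, which is the "memory" the self-connections provide.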

When the context units are considered as inputs, the Jordan network can be trained with the conventional back-propagation algorithm (see Multilayer Perceptron).
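A minimal sketch of this training scheme, under stated assumptions: a single sigmoid hidden layer, made-up layer sizes, and an arbitrary random sequence as training data. The context vector is simply concatenated with the external input, so ordinary back-propagation applies at each step:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the text); the context layer has one unit
# per output unit, since each context unit receives one output's feedback.
n_in, n_out, n_hid = 3, 2, 5
a = 0.5  # self-connection strength, 0 < a < 1

# The hidden layer sees the external input plus the context units.
W1 = rng.normal(scale=0.5, size=(n_hid, n_in + n_out))
b1 = np.zeros(n_hid)
W2 = rng.normal(scale=0.5, size=(n_out, n_hid))
b2 = np.zeros(n_out)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def step(x, Q):
    """One time step: forward pass plus the eqn [16] context update."""
    v = np.concatenate([x, Q])   # context units treated as ordinary inputs
    h = sigmoid(W1 @ v + b1)
    y = sigmoid(W2 @ h + b2)
    return y, a * Q + y, h, v

# A made-up training sequence; targets are arbitrary, for illustration only.
seq_x = rng.random((10, n_in))
seq_t = rng.random((10, n_out))

lr, losses = 0.5, []
for epoch in range(300):
    Q, loss = np.zeros(n_out), 0.0
    for x, t in zip(seq_x, seq_t):
        y, Q, h, v = step(x, Q)
        loss += 0.5 * np.sum((y - t) ** 2)
        # Plain back-propagation through the current step only: the
        # context vector is a fixed input, so no gradient flows back
        # through time.
        d_out = (y - t) * y * (1 - y)
        d_hid = (W2.T @ d_out) * h * (1 - h)
        W2 -= lr * np.outer(d_out, h); b2 -= lr * d_out
        W1 -= lr * np.outer(d_hid, v); b1 -= lr * d_hid
    losses.append(loss)
```

Because no gradient flows through the context's history, this is cheaper than back-propagation through time, at the cost of ignoring longer-range credit assignment.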

Figure 3 Diagram of Jordan network (input, hidden, and output layers, with a context layer fed back from the output).
