Jordan Network

Jordan proposed a partially recurrent network by adding recurrent links from the network's output to a set of context units in a context layer, and from the context units to themselves. The Jordan network learning procedure includes the following steps: (1) the output produced for each state is fed back to the context units and mixed with the input representing the next state at the input nodes (Figure 3); (2) this input-output combination constitutes the new network state for processing at the next time step; and (3) after several steps, the pattern present in the context units together with the input units is characteristic of the particular sequence of states. The self-connections in the context layer give the context units Q a memory of their own, so earlier outputs decay gradually rather than being discarded.

In discrete time, the context units Q are updated according to eqn [16]:

Q(t + 1) = aQ(t) + y(t)    [16]

where y is the activation of the output nodes and a (0 < a < 1) is the strength of the self-connections.
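As a rough illustration of steps (1)-(3) and of eqn [16], the following Python/NumPy sketch runs a small Jordan network forward over a short sequence. The layer sizes, random weights, and tanh/sigmoid activations are assumptions made only for this sketch; they are not part of the original description.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and weights -- assumptions for this sketch
n_in, n_hidden, n_out = 3, 5, 2
alpha = 0.5                                   # self-connection strength a, 0 < a < 1
W_hidden = rng.normal(size=(n_hidden, n_in + n_out), scale=0.5)
W_out = rng.normal(size=(n_out, n_hidden), scale=0.5)

def step(x_t, Q):
    """One time step of the Jordan network."""
    z = np.concatenate([x_t, Q])              # (1) mix current input with context units
    h = np.tanh(W_hidden @ z)                 # hidden activation
    y = 1.0 / (1.0 + np.exp(-(W_out @ h)))    # output activation
    Q_next = alpha * Q + y                    # eqn [16]: Q(t+1) = a*Q(t) + y(t)
    return y, Q_next

Q = np.zeros(n_out)                           # context units start empty
for x_t in rng.normal(size=(4, n_in)):        # a short input sequence
    y, Q = step(x_t, Q)                       # (2)-(3) context accumulates the sequence
```

Because 0 < a < 1, each context unit holds an exponentially decaying trace of the network's past outputs.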

When the context units are considered as inputs, the Jordan network can be trained with the conventional back-propagation algorithm (see Multilayer Perceptron).
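A minimal sketch of this training scheme follows, assuming PyTorch, random example data, and a mean-squared-error loss (all assumptions for illustration). The context vector is detached at each step, so gradients do not flow back through time and each update is an ordinary back-propagation step over the feed-forward part.

```python
import torch
import torch.nn as nn

# Illustrative dimensions and data -- assumptions for this sketch
n_in, n_hidden, n_out, T = 3, 5, 2, 10
torch.manual_seed(0)
xs = torch.randn(T, n_in)           # input sequence
targets = torch.rand(T, n_out)      # target output sequence
alpha = 0.5                         # self-connection strength a, 0 < a < 1

# Feed-forward part: (input + context) -> hidden -> output
net = nn.Sequential(
    nn.Linear(n_in + n_out, n_hidden), nn.Tanh(),
    nn.Linear(n_hidden, n_out), nn.Sigmoid(),
)
opt = torch.optim.SGD(net.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for epoch in range(100):
    Q = torch.zeros(n_out)          # context units start at zero
    for x_t, d_t in zip(xs, targets):
        y = net(torch.cat([x_t, Q]))     # context units treated as ordinary inputs
        loss = loss_fn(y, d_t)
        opt.zero_grad()
        loss.backward()                  # conventional back-propagation, no unrolling in time
        opt.step()
        Q = (alpha * Q + y).detach()     # eqn [16]; detach so no gradient flows through time
```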

Figure 3 Diagram of the Jordan network: input, hidden, and output layers, with a context layer receiving feedback from the output layer.