Real Time Recurrent Network

The real-time recurrent network (RTRN) is characterized by hidden neurons and a fully connected structure that permits arbitrary dynamics. Because the network operates in time, the RTRN is especially well suited to time-varying inputs and outputs, and it has been applied to speech recognition.

The RTRN has M external inputs, N fully connected neurons, and K outputs. Figure 5 shows a schematic diagram of the RTRN. An external input vector x(t) of size M is applied to the network at discrete time t, and the corresponding vector of individual neuron outputs, of size N, is produced one step later, at time (t + 1). The input vector x(t) and the one-step-delayed output vector y(t) are concatenated into a single vector u(t) of size (M + N), and the connection weights are collected in an N-by-(M + N) recurrent weight matrix W.

The net internal activity of neuron j at time t is

$$ v_j(t) = \sum_{i=1}^{M+N} w_{ji}(t)\, u_i(t), $$

where u_i(t) equals x_i(t) if index i refers to an external input and y_i(t) if index i refers to a fed-back neuron output, and w_{ji}(t) is the weight from node i to neuron j at time t. At the next time step (t + 1), the output of neuron j is computed by passing v_j(t) through the nonlinearity \varphi(\cdot) (e.g., the logistic function):

$$ y_j(t+1) = \varphi\bigl(v_j(t)\bigr). $$

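To make the forward computation concrete, the following is a minimal NumPy sketch of one RTRN time step under the definitions above, assuming a logistic activation; the names rtrn_step, W, x_t, and y_t are illustrative and not taken from the source.

```python
import numpy as np

def logistic(v):
    """Logistic nonlinearity used as the neuron activation."""
    return 1.0 / (1.0 + np.exp(-v))

def rtrn_step(W, x_t, y_t):
    """One forward step of the RTRN.

    W   : (N, M + N) recurrent weight matrix
    x_t : (M,) external input applied at time t
    y_t : (N,) fed-back neuron outputs available at time t
    Returns the (N,) neuron outputs produced one step later, at time t + 1.
    """
    u_t = np.concatenate([x_t, y_t])   # concatenated (M + N)-vector u(t)
    v_t = W @ u_t                      # net internal activity v_j(t) of every neuron
    return logistic(v_t)               # y(t + 1) = phi(v(t))

# Example with M = 3 external inputs and N = 5 neurons.
M, N = 3, 5
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(N, M + N))
y = np.zeros(N)                        # outputs start at zero
for t in range(10):
    x = rng.standard_normal(M)
    y = rtrn_step(W, x, y)
```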
Real-time recurrent learning (RTRL) adjusts the weights on-line as the network runs, handling the fed-back outputs within the real-time process, and it allows faster convergence in recurrent learning. The detailed algorithm can be found in Williams and Zipser.
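As a rough illustration of how such an on-line update can look, the sketch below combines one forward step with the RTRL sensitivity recurrence and weight change, assuming a logistic activation and treating every neuron as a visible output; the function name rtrl_step, the learning rate, and the sensitivity array P are illustrative assumptions, not details given in the source.

```python
import numpy as np

def rtrl_step(W, P, x_t, y_t, d_t, lr=0.05):
    """One combined forward / real-time recurrent learning step (sketch).

    W   : (N, M + N) weight matrix
    P   : (N, N, M + N) sensitivities, P[k, i, j] = dy_k(t) / dw_ij
    x_t : (M,) external input at time t
    y_t : (N,) neuron outputs at time t (fed back)
    d_t : (N,) desired outputs at time t (every neuron treated as visible here)
    """
    N, _ = W.shape
    u = np.concatenate([x_t, y_t])                 # concatenated (M + N)-vector u(t)

    # Forward step with the current weights: y(t+1) = phi(W u(t)).
    v = W @ u
    y_next = 1.0 / (1.0 + np.exp(-v))

    # Sensitivity recurrence:
    #   p_k,ij(t+1) = phi'(v_k(t)) * [ sum_l w_kl p_l,ij(t) + delta_ki u_j(t) ],
    # where w_kl ranges only over the weights on the fed-back outputs.
    W_rec = W[:, -N:]
    phi_prime = y_next * (1.0 - y_next)            # derivative of the logistic
    P_next = np.einsum('kl,lij->kij', W_rec, P)
    P_next += np.eye(N)[:, :, None] * u[None, None, :]
    P_next *= phi_prime[:, None, None]

    # Real-time weight change from the error at time t:
    #   dw_ij(t) = lr * sum_k e_k(t) p_k,ij(t)
    e = d_t - y_t
    W = W + lr * np.einsum('k,kij->ij', e, P)

    return W, P_next, y_next
```

Carrying the sensitivities P forward in time is what lets the weights be adjusted at every step without unrolling the network, which is the "real-time" aspect of the procedure.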

Figure 5 A diagram of an RTRN.