Forward-propagating step

Figure 2 shows the general structure of a neuron and its connections. Each connection from the ith to the jth neuron is associated with a quantity called a weight, or connection strength (w_{ji}). The net input (called the activation) of each neuron is the sum of all its input values multiplied by their corresponding connection weights, plus a bias term:

a_j = \sum_{i=1}^{n} x_i w_{ji} + \theta_j

where n is the number of neurons in the previous layer and \theta_j is a bias term that shifts the function horizontally (its input is a fixed value of 1). Once the activation of a neuron is calculated, we can determine the output value (i.e., the response) by applying a transfer function:

x_j = f(a_j)
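The activation sum and transfer function above can be sketched in a few lines of Python (a minimal illustration; the input values, weights, and bias are made-up numbers, not taken from the article):

```python
import math

def neuron_output(inputs, weights, bias):
    """Compute a neuron's activation a_j = sum_i x_i * w_ji + theta_j,
    then apply a sigmoid transfer function to obtain the response."""
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-activation))

# Example: three inputs feeding one neuron (arbitrary values).
x = [0.5, -1.0, 2.0]
w = [0.4, 0.3, -0.1]
theta = 0.2
print(neuron_output(x, w, theta))
```

The bias plays the role of \theta_j: it is simply added to the weighted sum before the transfer function is applied.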

Many transfer functions may be used: for example, a linear function, a threshold function, or a sigmoid function (Figure 3). The sigmoid function is often used because of its nonlinearity; it is given by

f(a_j) = \frac{1}{1 + e^{-a_j}}
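The three transfer functions mentioned can be compared side by side (an illustrative sketch; the sample activation values are arbitrary and not tied to Figure 3):

```python
import math

def linear(a):
    return a  # identity: output equals the activation

def threshold(a):
    return 1.0 if a >= 0 else 0.0  # hard step at zero

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))  # smooth, nonlinear, bounded in (0, 1)

# Evaluate each transfer function at a few activation levels.
for a in (-2.0, 0.0, 2.0):
    print(a, linear(a), threshold(a), round(sigmoid(a), 3))
```

Note how the sigmoid behaves like a softened threshold: it saturates toward 0 and 1 for large negative and positive activations but remains differentiable everywhere, which matters for weight training.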

The weights play an important role in the propagation of the signal through the network. They establish the link between an input pattern and its associated output pattern; that is, they contain the neural network's knowledge of the problem-solution relation.

The forward-propagation step begins with the presentation of an input pattern to the input layer, and continues as activation-level calculations propagate forward through the hidden layer(s) until the output layer is reached. In each layer, every neuron computes its activation from the outputs of the previous layer and passes its own response forward as input to the next.
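The layer-by-layer forward pass can be sketched as follows (a minimal fully connected network; the layer sizes and all weight and bias values are hypothetical, chosen only for illustration):

```python
import math

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

def forward_layer(inputs, weights, biases):
    """Propagate through one layer: output neuron j computes
    f(sum_i x_i * w[j][i] + theta_j)."""
    return [sigmoid(sum(x * w for x, w in zip(inputs, row)) + b)
            for row, b in zip(weights, biases)]

def forward_pass(pattern, layers):
    """Apply forward_layer for each (weights, biases) pair in turn,
    so activations flow input -> hidden layer(s) -> output."""
    values = pattern
    for weights, biases in layers:
        values = forward_layer(values, weights, biases)
    return values

# Example: 2 inputs -> 2 hidden neurons -> 1 output (arbitrary numbers).
hidden = ([[0.5, -0.4], [0.3, 0.8]], [0.1, -0.2])
output = ([[1.2, -0.7]], [0.05])
print(forward_pass([1.0, 0.0], [hidden, output]))
```

Each layer's outputs become the next layer's inputs, which is exactly the fan-out behavior shown for a single unit in Figure 2.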

Figure 2 Basic processing element (neuron) in a network. Each input connection value (x_i) is associated with a weight (w_{ji}). The output value (x_j = f(a_j)) can fan out to other units.
