The MLP architecture is a layered feed-forward neural network in which the nonlinear elements (neurons) are arranged in successive layers and information flows unidirectionally, from the input layer to the output layer, through the hidden layer(s) (Figure 2). Nodes in one layer are connected (via interconnections, or links) to all nodes in the adjacent layer(s), but neither lateral connections between nodes within a layer nor feedback connections are permitted. The number of input and output units depends on the representations of the input and output objects, respectively. The number of hidden layers and units is an important design parameter of the network. An MLP with an arbitrary number of hidden units has been shown to be a universal approximator for continuous maps, i.e., it can implement any continuous function to arbitrary accuracy.
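The layered feed-forward structure described above can be sketched as follows. This is a minimal illustration in Python with NumPy, assuming a tanh nonlinearity on the hidden layer and a linear output layer; the layer sizes and function names are chosen arbitrarily for the example.

```python
import numpy as np

def mlp_forward(x, weights, biases):
    """Feed-forward pass through an MLP: information flows strictly
    from input to output through the hidden layer(s), with no lateral
    or feedback connections between nodes."""
    a = x
    for i, (W, b) in enumerate(zip(weights, biases)):
        z = a @ W + b
        # Nonlinearity (tanh) on hidden layers; linear output layer.
        a = np.tanh(z) if i < len(weights) - 1 else z
    return a

# Illustrative 2-4-1 network: 2 inputs, one hidden layer of 4 units, 1 output.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((2, 4)), rng.standard_normal((4, 1))]
biases = [np.zeros(4), np.zeros(1)]

y = mlp_forward(np.array([[0.5, -0.3]]), weights, biases)
print(y.shape)  # one scalar output per input row
```

Each weight matrix fully connects one layer to the next, mirroring the all-to-all interconnection pattern between adjacent layers described above.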