Boltzmann learning underlies the Boltzmann machine, an artificial neural network (ANN) model based largely on the Hopfield network model (see Hopfield Network) with several important extensions. An understanding of the principles of neural networks in general (see Multilayer Perceptron), and of Hopfield networks in particular, is highly recommended before reading this article. Boltzmann learning uses stochastic binary units rather than the deterministic binary threshold units of a Hopfield model. It also allows for hidden units, which let these networks model data distributions far more complex than those a Hopfield net can learn. Furthermore, the use of simulated annealing (see Simulated Annealing) during training improves the chances that the network escapes local minima and finds a good configuration of weights.
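The contrast between stochastic and deterministic units can be sketched in a few lines. The following is an illustrative example (not code from the article): a unit turns on with probability given by the logistic function of its energy gap divided by a temperature, so that at high temperature it flips almost at random, while as the temperature approaches zero it behaves like a deterministic Hopfield threshold unit. The function name and the annealing schedule values are assumptions chosen for illustration.

```python
import math
import random

def stochastic_unit(energy_gap, temperature):
    """Return 1 with probability 1 / (1 + exp(-energy_gap / T)), else 0.

    energy_gap is the decrease in network energy caused by turning
    the unit on. High T -> near-random flipping; T -> 0 recovers a
    deterministic threshold unit (on iff energy_gap > 0).
    """
    p_on = 1.0 / (1.0 + math.exp(-energy_gap / temperature))
    return 1 if random.random() < p_on else 0

# Illustrative annealing schedule: start hot, cool gradually, so the
# network can hop out of local energy minima before settling.
for t in [10.0, 5.0, 2.0, 1.0, 0.5]:
    state = stochastic_unit(energy_gap=1.0, temperature=t)
```

At low temperature with a strongly positive energy gap the unit is on almost surely, matching the deterministic Hopfield behavior.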