Bayesian Models

While classical statistical modeling depends on careful formulation of the 'likelihood function' for the data, given the predictor variables and model parameters (which, to this point, the author has referred to as the 'error distribution'), Bayesian modeling adds a component called the 'prior distribution' on the parameters. The prior provides a way to incorporate knowledge about the parameters that existed before the data set at hand was analyzed. This is done by applying Bayes' theorem to combine the prior beliefs with the likelihood of the data, yielding the posterior distribution, which summarizes the knowledge that exists after accounting for the new data:

\[
P(\theta \mid x, y) = \frac{P(y \mid x, \theta)\, P(\theta)}{P(y \mid x)}
\]

On the left-hand side is the posterior (the distribution of the parameters θ given the data x and y). On the right-hand side, the numerator is the product of the likelihood (the probability of the response given the predictors and parameters) and the prior distribution on the parameters. The denominator is the marginal probability of observing the data, which is calculated by integration and is a constant with respect to θ. Therefore, Bayes' theorem can be stated in proportional form as posterior ∝ likelihood × prior.
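To make the proportional form concrete, here is a minimal Python sketch (illustrative only) that applies posterior ∝ likelihood × prior on a grid of candidate values for a binomial survival probability; the 7-survivors-in-10-trials data are invented for the example.

```python
import numpy as np

# Estimate a survival probability theta from 7 survivors in 10 trials
# (invented data), using Bayes' theorem on a grid of candidate values.
theta = np.linspace(0.001, 0.999, 999)      # grid of candidate parameter values
prior = np.ones_like(theta)                 # flat prior: all values equally plausible
likelihood = theta**7 * (1 - theta)**3      # binomial likelihood, up to a constant

unnormalized = likelihood * prior           # posterior proportional to likelihood x prior
posterior = unnormalized / np.trapz(unnormalized, theta)  # divide by the marginal (integral)

# Posterior summaries: mean and a central 95% credible interval
post_mean = np.trapz(theta * posterior, theta)
cdf = np.cumsum(posterior) * (theta[1] - theta[0])
lo, hi = theta[np.searchsorted(cdf, 0.025)], theta[np.searchsorted(cdf, 0.975)]
print(f"posterior mean {post_mean:.3f}, 95% interval ({lo:.3f}, {hi:.3f})")
```

On a grid, the integration in the denominator of Bayes' theorem reduces to a simple numerical normalization.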

Thus, a prior distribution on the model parameters is required for every Bayesian analysis. The ability to express such prior beliefs about parameters extends the meaning of probability in Bayesian statistics relative to classical statistics. Probability is no longer used only to describe random events that can be sampled empirically; it is also understood to represent uncertainty or partial knowledge of a quantity that may be deterministic. For example, the parameters of a model, whether believed to be stochastic or deterministic, can, in the Bayesian framework, be represented by probability distributions indicating the analyst's degree of belief in the various possible values. Thus, Bayes' theorem can be viewed as a way to update one's prior beliefs to arrive at posterior beliefs that are logical given the data. The posterior beliefs are in the form of probability distributions on model parameters that can be used, analogously to the classical case, to generate a distribution of model predictions. However, unlike for classical regression, the distribution is not based on simplifying assumptions and does not have to be of any particular form.
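As an illustration of how a posterior generates a free-form predictive distribution, the sketch below (continuing the invented 7-of-10 binomial example with a flat prior) draws parameter values in proportion to their posterior probability and simulates new observations from each draw; no distributional shape is imposed on the result.

```python
import numpy as np

rng = np.random.default_rng(1)

# Posterior for the invented 7-of-10 binomial data with a flat prior,
# on the same grid as above.
theta = np.linspace(0.001, 0.999, 999)
posterior = theta**7 * (1 - theta)**3
posterior /= posterior.sum()                # normalize to use as sampling weights

# Draw parameter values in proportion to their posterior probability, then
# simulate new data from each draw. The predictive distribution takes whatever
# shape the posterior and likelihood imply; no normality is assumed.
theta_draws = rng.choice(theta, size=5000, p=posterior)
y_new = rng.binomial(10, theta_draws)       # predicted survivors in 10 new trials
print(np.bincount(y_new, minlength=11) / 5000)  # posterior predictive probabilities
```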

When one can bring prior information into statistical analysis, a whole new world of modeling opens up. The need for simple models with parameters that are statistically identifiable from a given data set is no longer a restriction if there is strong prior evidence for behavior that is not explicitly contained in the data. This evidence is expressed as prior distributions on parameters and may come from studies at other locations or times, or simply from informal experience. The subjective nature of the priors continues to be a controversial aspect of Bayesian modeling.
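A small sketch of how prior evidence from elsewhere enters the calculation, using the conjugate normal case (a normal likelihood with known variance and a normal prior), where the posterior has a closed form; all of the numbers are invented for illustration.

```python
import numpy as np

def normal_posterior(y, sigma, mu0, tau0):
    """Posterior for a normal mean with known data s.d. sigma and a
    normal prior N(mu0, tau0^2) -- the conjugate case, so the posterior
    is itself normal and available in closed form."""
    n = len(y)
    prec = 1.0 / tau0**2 + n / sigma**2     # posterior precision = prior + data precision
    mu_post = (mu0 / tau0**2 + n * np.mean(y) / sigma**2) / prec
    return mu_post, np.sqrt(1.0 / prec)     # posterior mean and s.d.

y_local = np.array([4.1, 5.3, 3.8])         # a few local observations (invented)
sigma = 1.0                                 # within-study s.d., assumed known

# A vague prior lets the sparse local data speak for themselves ...
print(normal_posterior(y_local, sigma, mu0=0.0, tau0=100.0))
# ... while an informative prior from studies elsewhere pulls the
# estimate toward the prior mean and narrows the uncertainty.
print(normal_posterior(y_local, sigma, mu0=5.5, tau0=0.3))
```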

In the Bayesian framework, degrees of belief can also be stated on model 'structures', not only parameters. Therefore, Bayes' theorem can be used to calculate the relative support given to multiple models after obtaining a new data set. Predictions then consist of a weighted combination of the predictions of each model. This process of Bayesian model averaging acknowledges that there is no single true model of an ecological system, but rather there are several acceptable descriptions.
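A hypothetical sketch of Bayesian model averaging: given already-computed marginal likelihoods p(data | M_k) for each candidate model (the values below are placeholders), Bayes' theorem at the model level yields posterior model probabilities, which then weight each model's prediction.

```python
import numpy as np

# Log marginal likelihoods log p(data | M_k) for three candidate models.
# These values are placeholders; in practice each comes from integrating
# that model's likelihood over its prior.
log_marginal = np.array([-102.4, -100.1, -103.7])
prior_model = np.array([1/3, 1/3, 1/3])     # equal prior belief in each model

# Bayes' theorem at the model level: P(M_k | data) is proportional to
# p(data | M_k) * P(M_k). Work on the log scale for numerical stability.
log_w = log_marginal + np.log(prior_model)
w = np.exp(log_w - log_w.max())
w /= w.sum()                                # posterior model probabilities

# Model-averaged prediction: weight each model's prediction by its probability.
preds = np.array([12.3, 14.0, 11.5])        # hypothetical point predictions
print("weights:", np.round(w, 3), "averaged prediction:", round(np.dot(w, preds), 2))
```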

All of the statistical methods described in this article can be analyzed from a Bayesian perspective. In some situations, such as a linear model with little prior knowledge and an abundance of data, the classical and Bayesian results will differ only slightly. However, in other situations, such as a hierarchical model with abundant cross-system data but few local data, the results, in terms of both the mean prediction and the predictive uncertainty, will be quite different. In fact, in most situations, the Bayesian approach suggests a whole new way of thinking about modeling. One begins to recognize that there is uncertainty and stochasticity at multiple levels of any problem. It is not all just 'measurement error' in y, as classical regression usually assumes. Uncertainty is also usually present in the values of the predictor variables x, for example, as well as in the applicability of some model parameters to certain subsets of the data. We also see that we can bring a wide variety of information to bear on such issues, not only data from the study at hand. In many ways, this is more consistent with the process of scientific learning, in which each new study builds upon the results of previous studies, and Bayesian modeling provides a formal framework for how this learning can occur.
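One way to see the hierarchical point is a partial-pooling sketch in the normal case: each site's raw mean is shrunk toward a cross-system mean in proportion to its sampling noise, so sites with few local data borrow most from the other systems. The numbers, and the crude moment estimates of the cross-system parameters, are for illustration only.

```python
import numpy as np

# Partial pooling in the normal hierarchical model: each site's mean is
# assumed drawn from a shared cross-system distribution N(mu, tau^2), so
# each raw site mean is shrunk toward the overall mean in proportion to
# its sampling noise. All numbers are invented.
site_means = np.array([2.1, 9.4, 5.0, 6.2])   # raw mean at each site
site_n = np.array([3, 50, 4, 40])             # sites with few data get shrunk most
sigma = 2.0                                   # within-site s.d. (assumed known)

mu = np.average(site_means, weights=site_n)   # rough cross-system mean
tau = site_means.std(ddof=1)                  # rough between-site spread

se2 = sigma**2 / site_n                       # sampling variance of each raw mean
shrink = tau**2 / (tau**2 + se2)              # weight given to the local data
pooled = shrink * site_means + (1 - shrink) * mu
print(np.round(pooled, 2))
```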

Difficulties with Bayesian modeling are mostly computational: outside of simple conjugate cases, the posterior distribution cannot be derived analytically and must be approximated numerically. However, accurate and efficient algorithms continue to be developed by the statistical research community and are being incorporated into user-friendly software packages. Thus, computational problems should not be expected to limit the use of Bayesian models for long.
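In practice, the computation usually means Markov chain Monte Carlo. Below is a minimal random-walk Metropolis sampler for the earlier invented 7-of-10 binomial example; it is a teaching sketch, not an efficient production algorithm. Note that only the unnormalized posterior is needed, because the normalizing constant cancels in the acceptance ratio.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(theta):
    """Unnormalized log posterior for the earlier 7-of-10 binomial
    example with a flat prior."""
    if not 0.0 < theta < 1.0:
        return -np.inf
    return 7 * np.log(theta) + 3 * np.log(1 - theta)

samples, theta = [], 0.5
for _ in range(20_000):
    proposal = theta + rng.normal(0.0, 0.1)   # random-walk proposal
    # Accept with probability min(1, posterior ratio); the unknown
    # normalizing constant cancels in the ratio, so it is never needed.
    if np.log(rng.uniform()) < log_post(proposal) - log_post(theta):
        theta = proposal
    samples.append(theta)

draws = np.array(samples[2_000:])             # discard burn-in
print(f"posterior mean {draws.mean():.3f}")
```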
