In introducing Bayesian methods, this chapter made two important points. Firstly, Bayesian methods can answer questions that are relevant to ecologists, such as: 'What is the probability that this hypothesis is true?' and 'What is the probability that a parameter will take values within a specified interval?' Secondly, relevant prior information can also be incorporated into Bayesian analyses to improve the precision of estimates.
Bayes' rule is the basis of Bayesian methods. It is derived as a simple expression of conditional probability. The rule specifies how prior information and data are combined using a model to arrive at the posterior state of knowledge. Both the prior and posterior states of knowledge are represented as probability distributions. The posterior probability equals the prior probability multiplied by the likelihood of the data, divided by a scaling constant (the overall probability of the data). Bayesian methods become difficult in practice because this scaling constant is usually hard to calculate analytically. However, numerical methods such as Markov chain Monte Carlo (MCMC) now make Bayesian analyses accessible to all scientists.
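For a single parameter, the scaling constant can be computed directly by discretizing the parameter on a grid. The following sketch (with hypothetical data: a species detected on 4 of 10 survey visits) applies Bayes' rule to estimate a detection probability under a flat prior:

```python
import numpy as np

# Hypothetical data: a species is detected on 4 of 10 survey visits.
detections, visits = 4, 10

# Discretize the detection probability p on a fine grid.
p = np.linspace(0.001, 0.999, 999)

# Uninformative (flat) prior over p.
prior = np.ones_like(p) / p.size

# Likelihood of the data under each candidate value of p
# (binomial probability; constant factors omitted since they cancel).
likelihood = p**detections * (1 - p) ** (visits - detections)

# Bayes' rule: posterior is proportional to prior x likelihood; the
# scaling constant is the sum over all candidate values of p.
unnormalized = prior * likelihood
posterior = unnormalized / unnormalized.sum()

print(posterior.sum())        # 1.0: a proper probability distribution
print((p * posterior).sum())  # posterior mean, approx (4+1)/(10+2)
```

With more parameters the grid grows exponentially, which is why MCMC methods, which sample from the posterior without ever computing the scaling constant, are used instead.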
Frequentist confidence intervals and Bayesian credible intervals will usually be numerically equivalent if uninformative priors are used. In this way Bayesian methods provide a numerical generalization of frequentist methods, while also permitting mathematically coherent probabilistic statements about the state of nature. The next chapter provides a more thorough comparison of different statistical schools and examines their various strengths and weaknesses.
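The numerical equivalence can be checked directly in a simple case. This sketch uses simulated data (25 measurements from a normal distribution with known standard deviation, values chosen arbitrarily) and compares the frequentist 95% confidence interval for the mean with the 95% credible interval from a grid posterior under a flat prior:

```python
import numpy as np

# Simulated data: 25 measurements, known sigma = 2 (hypothetical values).
rng = np.random.default_rng(1)
sigma, n = 2.0, 25
data = rng.normal(10.0, sigma, n)
xbar, se = data.mean(), sigma / np.sqrt(n)

# Frequentist 95% confidence interval for the mean.
ci_lo, ci_hi = xbar - 1.96 * se, xbar + 1.96 * se

# Grid posterior for the mean under a flat prior: proportional to the
# normal likelihood of the sample mean at each candidate value.
mu = np.linspace(xbar - 5 * se, xbar + 5 * se, 20001)
posterior = np.exp(-0.5 * ((xbar - mu) / se) ** 2)
posterior /= posterior.sum()

# Central 95% credible interval from the posterior's quantiles.
cdf = posterior.cumsum()
cred_lo = mu[np.searchsorted(cdf, 0.025)]
cred_hi = mu[np.searchsorted(cdf, 0.975)]

print(ci_lo, ci_hi)      # confidence interval
print(cred_lo, cred_hi)  # credible interval: nearly identical
```

The two intervals agree to within the grid's resolution, although their interpretations differ: the credible interval is a direct probability statement about the parameter, while the confidence interval is a statement about the long-run behaviour of the procedure.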