Figure 7 Confidence intervals from parametric bootstrapping. (a) Simulated bootstrap data (solid points) generated from the fitted statistical model (solid line). The observed algal dynamics are shown with open circles. (b) The likelihood surface of Figure 6 showing the 95% confidence region from the likelihood ratio distribution (gray line). The black dots are the most likely parameter estimates fit to each of the simulated bootstrap data sets. (c) Sampling distribution of the maximum per capita growth rate parameter (r) from 5000 bootstrap replicates. The 95% bootstrap confidence intervals are obtained from this sampling distribution (dashed lines).

The main steps in fitting statistical models to data are to derive the likelihood function for the statistical model, estimate the most likely parameters given the observed data, and then generate confidence intervals. However, a number of problems can emerge that make the process more challenging. In particular, challenges arise when the likelihood function is difficult to derive, when the estimators are biased, and when the parameter estimates are either nonidentifiable or nonestimable.
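The parametric bootstrap procedure illustrated in Figure 7 can be sketched as follows. This is a minimal illustration only: the logistic-growth data, parameter values, and noise level below are all hypothetical, and least squares stands in for maximum likelihood under normal errors.

```python
# Parametric bootstrap confidence interval for the growth rate r (sketch;
# all data and parameter values here are made up for illustration).
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

def logistic(t, r, K, n0=10.0):
    # Logistic growth from a fixed initial density n0.
    return K / (1 + (K / n0 - 1) * np.exp(-r * t))

# Hypothetical "observed" algal time series.
t = np.arange(0, 21, 2.0)
true_r, true_K, sigma = 0.5, 500.0, 25.0
obs = logistic(t, true_r, true_K) + rng.normal(0, sigma, t.size)

# Step 1: fit the model to the observed data (ML under normal errors).
(r_hat, K_hat), _ = curve_fit(logistic, t, obs, p0=[0.3, 400.0])
resid_sd = np.std(obs - logistic(t, r_hat, K_hat), ddof=2)

# Step 2: simulate bootstrap data sets from the fitted model and refit each.
boot_r = []
for _ in range(1000):
    sim = logistic(t, r_hat, K_hat) + rng.normal(0, resid_sd, t.size)
    (r_b, _K_b), _ = curve_fit(logistic, t, sim, p0=[r_hat, K_hat])
    boot_r.append(r_b)

# Step 3: read the 95% interval off the sampling distribution of r.
lo, hi = np.percentile(boot_r, [2.5, 97.5])
print(f"r = {r_hat:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

Each bootstrap refit corresponds to one black dot on the likelihood surface of Figure 7(b), and the collected `boot_r` values form the sampling distribution of panel (c).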

Nonlikelihood estimators. The method of maximum likelihood is the most popular method for estimating parameters, but it is not the only one available. Likelihood functions require that the probability distribution of the stochastic component be known. In situations where the distribution is unknown but the relationship between the mean and the variance is known, quasi-likelihood methods can be used to obtain the most likely parameter estimates. Alternatively, parameter estimates can be obtained using the method of estimating functions.
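A minimal sketch of the quasi-likelihood idea, with hypothetical data: only the mean-variance relationship Var(y) = φ·μ is assumed, not a full distribution, and the parameters solve the quasi-score estimating equations rather than maximize a likelihood.

```python
# Quasi-likelihood sketch for a hypothetical exponential-decay mean model
# with overdispersed counts; only Var = phi * mu is assumed.
import numpy as np
from scipy.optimize import root

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 40)
true_a, true_b = 50.0, 0.3
mu_true = true_a * np.exp(-true_b * t)
# Counts whose variance grows with the mean; the exact distribution
# is treated as unknown by the estimation method.
y = rng.negative_binomial(n=5, p=5 / (5 + mu_true))

def quasi_score(theta):
    a, b = theta
    mu = a * np.exp(-b * t)
    w = (y - mu) / mu  # (y - mu) / V(mu) with V(mu) = mu; phi cancels
    # One estimating equation per parameter: sum of (dmu/dtheta) * w = 0.
    return [np.sum(np.exp(-b * t) * w),
            np.sum(-a * t * np.exp(-b * t) * w)]

sol = root(quasi_score, x0=[40.0, 0.2])
a_hat, b_hat = sol.x
print(f"a = {a_hat:.2f}, b = {b_hat:.3f}")
```

When the stochastic component really is, say, Poisson, these equations reduce to the ordinary likelihood score equations, which is why quasi-likelihood estimates inherit many properties of maximum likelihood estimates.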

Biased estimators. Bias refers to the difference between the expectation of a distribution of maximum likelihood parameter estimates and the true parameter value. If the difference is zero, then the estimator used to find the estimates is unbiased. Bias is straightforward to assess by simulating a large number of data sets from the statistical model (with the same structure as the raw data in terms of both the number and spacing of observations), and estimating the most likely parameter values for each simulation. Estimator bias is found by comparing the mean of the parameter estimates with the known parameter values used to simulate the data.
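The simulation recipe above can be sketched with a textbook case: the maximum likelihood estimator of a normal variance (dividing the sum of squares by n rather than n − 1) is biased downward, and simulating many data sets makes the bias visible. The sample size and variance below are arbitrary choices for illustration.

```python
# Assessing estimator bias by simulation: compare the mean of many
# ML variance estimates against the known value used to simulate.
import numpy as np

rng = np.random.default_rng(3)
true_sigma2, n, reps = 4.0, 10, 20000

est = np.empty(reps)
for i in range(reps):
    # Simulate a data set with the same structure as the raw data
    # (here simply n observations).
    y = rng.normal(0.0, np.sqrt(true_sigma2), n)
    est[i] = np.mean((y - y.mean()) ** 2)  # ML estimate of sigma^2

bias = est.mean() - true_sigma2
print(f"mean estimate {est.mean():.3f}, bias {bias:.3f}")
# Theory gives E[estimate] = (n-1)/n * sigma^2, so bias ~ -sigma2/n = -0.4
```

The same loop applies to any fitted model: replace the normal draw with a simulation from the fitted statistical model and the variance formula with a call to the model-fitting routine.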

Identifiability and estimability. Estimability refers to whether or not the data are sufficient to estimate the most likely parameter values. For example, if the algal dynamics of Figure 1 had only three data points, then the parameters r, K, and σ² would be nonestimable because different combinations of parameter estimates would yield very similar likelihood values. Nonestimability is primarily a data problem and can be recognized by unrealistically large confidence intervals. Nonidentifiability, on the other hand, is an ill-posed property of the statistical model itself: different combinations of parameter values yield the same likelihood value regardless of how much data are available. Ecologists may encounter this problem in models with a large number of biologically relevant parameters, because some parameters can have an equivalent effect in the statistical model. Determining whether parameters are identifiable can be difficult, but the recent method of data cloning provides a novel solution. Working in a Bayesian framework, data cloning uses the observed data to update the prior distribution of the parameters; by casting the posterior distribution as a new prior and repeatedly fitting the model to the observed data, the method can be used to estimate the maximum likelihood parameter values, and it provides a formal test for nonidentifiability. Solutions to nonidentifiability are either to collect a different kind of data that can distinguish the parameters, such as hierarchical data, or to develop an alternative statistical model in which the parameters are identifiable.
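The distinction can be made concrete with a deliberately nonidentifiable toy model (entirely hypothetical): if two parameters enter the mean only through their product, any rescaling of one against the other leaves the likelihood unchanged, no matter how much data are collected.

```python
# Nonidentifiability sketch: a and b appear only as the product a*b,
# so (a, b) and (2a, b/2) are indistinguishable from the data.
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(1, 10, 200)
y = 2.0 * 3.0 * x + rng.normal(0, 1.0, x.size)

def neg_log_lik(a, b, sigma=1.0):
    # Normal negative log-likelihood up to an additive constant.
    mu = a * b * x
    return 0.5 * np.sum((y - mu) ** 2) / sigma**2

nll_1 = neg_log_lik(2.0, 3.0)
nll_2 = neg_log_lik(4.0, 1.5)    # same product a*b = 6
print(np.isclose(nll_1, nll_2))  # True: the data cannot separate a from b
```

The likelihood surface for (a, b) has a ridge of equal height along every curve a·b = constant, which is exactly the flat-likelihood signature that a data cloning analysis would flag as nonidentifiable.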
