One criticism of the previous methods is that they implicitly assume a particular response form of the variables to any underlying gradients. This response form is a function of the multivariate distance preserved by each method: Euclidean distance to reflect linear responses (PCA) or χ² distance to represent unimodal responses (CA). An alternative group of ordinations is known as scaling methods. In contrast to PCA and CA, scaling methods begin with a multivariate distance matrix selected by the analyst, which enables a range of ordinations, each preserving a chosen distance.

Principal coordinate (PCO) analysis, also known as metric or classical scaling, is an eigenanalysis approach to scaling. The analysis begins with a distance matrix D, the elements of which are transformed by −0.5D², then centered across the distance matrix. The resulting matrix is then subjected to eigenanalysis, the eigenvalues and eigenvectors extracted, and each eigenvector scaled by the square root of its eigenvalue. This process is very similar to the calculation of PCA; however, in PCA the variables are multiplied by the eigenvectors to project the samples onto the ordination axes, whereas in PCO the scaled eigenvectors themselves are the sample positions on the axes. The original variables do not feature in the ordination process. As with PCA, the eigenvalues provide a measure of the variance explained by the ordination. The similarity with PCA is even more apparent because a PCO on a Euclidean distance matrix of the same data/transformation will generate exactly the same ordination as a PCA on the covariance matrix. One problem that may occur with PCO is that if the distance measure is non-Euclidean (i.e., cannot be fully represented in Euclidean space), then the distance matrix itself may not be fully represented in the ordination space. This will be indicated by negative eigenvalues. The problem can be resolved by transforming the distances (e.g., square root), adding a constant to the off-diagonals of the distance matrix, or adding a transformed function of each distance to itself. As with PCA and CA, deciding how many ordination axes to plot is usually based on a scree plot of the eigenvalues.
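The PCO steps above can be sketched in a few lines of Python (an illustrative sketch, not taken from the source; the function name `pco` and the toy data are assumptions):

```python
import numpy as np

def pco(D, n_axes=2):
    """Principal coordinate (PCO) analysis of a distance matrix D."""
    n = D.shape[0]
    A = -0.5 * D ** 2                     # transform each element by -0.5 * d^2
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    G = J @ A @ J                         # centre across the distance matrix
    eigvals, eigvecs = np.linalg.eigh(G)  # eigenanalysis
    order = np.argsort(eigvals)[::-1]     # sort axes by decreasing eigenvalue
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    # Scale each eigenvector by the square root of its eigenvalue:
    # these scaled eigenvectors ARE the sample positions on the axes.
    coords = eigvecs[:, :n_axes] * np.sqrt(np.maximum(eigvals[:n_axes], 0.0))
    return coords, eigvals

# Toy samples-by-variables data and its Euclidean distance matrix
X = np.array([[1.0, 2.0], [3.0, 1.0], [2.0, 4.0], [5.0, 3.0]])
D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
coords, eigvals = pco(D)
```

Because D is Euclidean here, all eigenvalues are non-negative and the two-axis solution reproduces the input distances exactly; a PCA on the covariance matrix of X would give the same configuration (up to reflection of axes). A non-Euclidean distance measure would show up as negative entries in `eigvals`.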

An alternative scaling approach is multidimensional scaling (MDS; among many ecologists in particular, this term has become synonymous with nonmetric multidimensional scaling, nm-MDS). As with classical scaling, the analysis begins with a distance measure chosen by the researcher. However, instead of sequentially deriving orthogonal axes using eigenanalysis, MDS generates an ordination of a specified dimensionality. Generating an ordination of a given dimensionality is an iterative procedure. Typically, a random configuration of the samples in (say) a two-dimensional ordination space acts as a starting point. The differences between the observed distances and the inter-sample distances in the current configuration are calculated. These distances are regressed on each other using one of a range of methods, depending on whether a metric or nonmetric MDS is required. Metric MDS will fit a parametric relationship specified by the user, such as linear or polynomial regression, with or without transformation of the distance, while nm-MDS will fit a monotone regression on the ranked distances. The goodness of fit of the regression is calculated from the differences between observed and predicted values, combined in one of three common ways: Stress 1, √[Σ(differences)² / Σ(observed distances)²]; Stress 2, √[Σ(differences)² / Σ(observed distance − mean observed distance)²]; and SStress, √[Σ(observed distance² − fitted distance²)²]. The ordination configuration is shifted in the next iteration to a configuration that is likely to decrease stress, using a numerical optimization procedure called the method of steepest descent; the stress is recalculated, compared with the previous iteration, and the process repeated until the difference between iterations reaches some small tolerance value. The final sample configuration is the ordination, and the stress value provides a measure of goodness of fit, with smaller values indicating a better fit.
This procedure is run for a range of dimensional solutions (e.g., two, three, four, five, and so on) to identify the dimensionality of the data, and the solution beyond which additional dimensions yield little reduction in stress is chosen. This is similar to examining changes in eigenvalues in an eigenanalysis approach. There are no sound theoretical reasons for deciding what a 'good' stress value is, but a value less than 0.1 is usually desirable. However, extremely small stress values (e.g., 0.001) should be treated with caution, as they usually indicate a degenerate solution.
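The iterative procedure and the Stress 1 calculation can be illustrated with a toy metric MDS by steepest descent (a sketch under simplified assumptions, not taken from the source: a production nm-MDS would fit a monotone regression on the ranked distances, which is omitted here, and all names and data are illustrative):

```python
import numpy as np

def pairwise(X):
    """Euclidean distances between all rows of X."""
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

def stress1(D_obs, D_fit):
    """Stress 1: sqrt(sum of squared differences / sum of squared observed distances)."""
    mask = ~np.eye(D_obs.shape[0], dtype=bool)
    resid = (D_obs - D_fit)[mask]
    return np.sqrt((resid ** 2).sum() / (D_obs[mask] ** 2).sum())

def metric_mds(D, n_dim=2, n_iter=1000, lr=0.01, seed=0):
    """Toy metric MDS: steepest descent on the raw squared-error loss."""
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(D.shape[0], n_dim))   # random starting configuration
    for _ in range(n_iter):
        diff = X[:, None, :] - X[None, :, :]
        d = pairwise(X)                        # current ordination distances
        np.fill_diagonal(d, 1.0)               # avoid division by zero
        resid = d - D
        np.fill_diagonal(resid, 0.0)
        # Gradient of sum over pairs of (d_ij - D_ij)^2 with respect to X
        grad = 4.0 * ((resid / d)[:, :, None] * diff).sum(axis=1)
        X -= lr * grad                         # shift configuration downhill
    return X

# Observed distances from four points on a unit square, so an exact
# two-dimensional solution exists
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
D = pairwise(pts)
X_final = metric_mds(D)
```

Running the optimization from several random seeds and keeping the lowest-stress configuration guards against the local-optimum problem discussed below.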

An nm-MDS of the Euclidean distance of the χ²-transformed triplefin data set identifies the same patterns as those identified by PCA and CA (Figure 2c). This similarity with the other ordination methods illustrates the importance of the data transformation and the multivariate distance implicitly preserved in the method, rather than the calculation method itself. In MDS, the original variables form no part of the ordination analysis, so there is no formal reason to display their values on a plot (unlike PCA or CA). However, it is common practice to regress or correlate the original values with the sample scores and project these coefficients onto the plot (Figure 2f).

MDS has some unique features that should be treated with caution by users. MDS can become trapped in local solutions, depending on the starting configuration. For example, there may be a number of two-dimensional solutions that fit locally (i.e., small changes from their position will increase the stress), but the best-fitting one may not be identified during any particular analytical run. It is recommended that a number of different random starting points be used, or, if computational power is limiting, the results from a PCO analysis of the same distance matrix could be used as a starting configuration. The ordination axes of MDS are entirely arbitrary and mean nothing other than providing a convenient reference for defining an ordination space; however, many software implementations rotate the axes using PCA so that the first axis usually maximally separates the samples. Unlike eigenanalysis methods, which progressively subdivide variation so that the position on the first axis is fixed regardless of the number of axes examined, the position of samples on MDS axes may change when different dimensionalities are extracted.

Although the implicit distance measures used by PCA and CA have formed the basis of much of their criticism, it should be noted that the behavior of a metric ordination is largely a function of data transformation and scaling, and many published comparisons have used explicitly untransformed data for PCA/CA in comparison with implicitly transformed/reweighted data in scaling methods. It should also be noted that suitable rescalings and transformations, followed by PCA (preserving Euclidean distance), can be used to preserve simple matching, chord, χ², Hellinger, Mahalanobis, and species profile distances, all of which are commonly used in ecology, in a PCA biplot. The behavior of metric ordinations can also be controlled by various transformations to reduce the effects of dominant values and by standardizations (by variable and/or sample standard deviations, means, totals, or ranges); for many ecological data sets, metric ordination can generate results similar to scaling methods, with the additional advantage of explicitly overlaying the variables in the reduced-space plot. Scaling methods are appropriate when the desired distance to be preserved is not Euclidean, such as the Bray-Curtis distance (although the Hellinger distance appears to behave like the Bray-Curtis index in metric ordinations), and when there are more variables than samples. An additional caution is that while methods such as PCA and CA assume a certain underlying response model, the response forms that MDS can detect are unknown. Whether this is a problem depends on the question and intent of the researcher. But clearly, deciding on the 'best' ordination requires understanding the effects of transformation, standardization, distance preserved, and the underlying response structure of the data, rather than selecting an ordination by name. This is illustrated by contrasting a PCA of χ²-transformed data with a CA of the raw data and an nm-MDS of the Euclidean distance of χ²-transformed data.
All reduced-space plots are very similar (Figure 2). This is because the data transformation and distance preserved in the ordination are of central importance, and the algorithms used are simply tools to achieve an ecological objective.
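As an illustration of how a transformation followed by PCA can preserve a non-Euclidean distance, the sketch below (hypothetical data, not from the source) applies the Hellinger transformation and then a covariance-based PCA, so that Euclidean distances among the sample scores equal Hellinger distances among the original samples:

```python
import numpy as np

# Hypothetical community data: rows = samples, columns = species abundances
Y = np.array([[10.0, 0.0, 5.0],
              [4.0, 2.0, 6.0],
              [0.0, 8.0, 1.0]])

# Hellinger transformation: square root of the relative abundance per sample
H = np.sqrt(Y / Y.sum(axis=1, keepdims=True))

# Covariance-based PCA on the transformed data
Hc = H - H.mean(axis=0)                # centre each column
cov = Hc.T @ Hc / (H.shape[0] - 1)     # covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]      # axes by decreasing variance explained
scores = Hc @ eigvecs[:, order]        # sample scores on all PCA axes
```

Because PCA preserves Euclidean distance, Euclidean distances among the full set of scores equal the Hellinger distances among the original samples; keeping only the first axes gives the usual reduced-space approximation, and the variables can still be overlaid as a biplot.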
