# Multidimensional Scaling Algorithm


A Multidimensional Scaling Algorithm is an ordination algorithm that produces a [math]k[/math]-dimensional, [math]k \le p[/math], representation of the items such that the distances among the points in the new space reflect the proximities in the data.

**AKA:** Multidimensional Scaling, MDS.

**Context:**

- It can range from being a Metric Multidimensional Scaling to being a Non-Metric Multidimensional Scaling.

**Counter-Example(s):**

**See:** Torgerson Scaling, Generalized Multidimensional Scaling, Distance Measure, Affinity Matrix, Non-Linear Dimensionality Reduction Algorithm.
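The definition above can be sketched with scikit-learn's `MDS` estimator. This is a minimal illustration, not a prescribed recipe; the random data and parameter values are assumptions made for the example:

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))   # n = 20 items in a p = 5 dimensional space

# Embed into k = 2 <= p dimensions so that pairwise distances in the new
# space approximate the original Euclidean proximities.
mds = MDS(n_components=2, dissimilarity="euclidean", random_state=0)
X_2d = mds.fit_transform(X)
print(X_2d.shape)  # (20, 2)
```

Passing `dissimilarity="precomputed"` instead would let the estimator start directly from an [math]n \times n[/math] proximity matrix, matching the algorithm's usual input.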

## References

### 2014

- http://scikit-learn.org/stable/modules/manifold.html#isomap
- QUOTE: One of the earliest approaches to manifold learning is the Isomap algorithm, short for Isometric Mapping. Isomap can be viewed as an extension of Multi-dimensional Scaling (MDS) or Kernel PCA. Isomap seeks a lower-dimensional embedding which maintains geodesic distances between all points. Isomap can be performed with the object Isomap.
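A short usage sketch of the `Isomap` object the quote mentions; the synthetic data and the `n_neighbors` value are illustrative assumptions, not recommendations:

```python
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))   # 50 items in 4 dimensions

# Isomap preserves geodesic (graph shortest-path) distances rather than
# raw Euclidean ones while embedding into a lower-dimensional space.
embedding = Isomap(n_neighbors=5, n_components=2).fit_transform(X)
print(embedding.shape)  # (50, 2)
```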

### 2011

- http://en.wikipedia.org/wiki/Multidimensional_scaling
**Multidimensional scaling** (MDS) is a set of related statistical techniques often used in information visualization for exploring similarities or dissimilarities in data. MDS is a special case of ordination. An MDS algorithm starts with a matrix of item–item similarities, then assigns a location to each item in *N*-dimensional space, where *N* is specified a priori. For sufficiently small *N*, the resulting locations may be displayed in a graph or 3D visualisation.

MDS algorithms fall into a taxonomy, depending on the meaning of the input matrix:

- Classical multidimensional scaling: also known as Torgerson Scaling or Torgerson–Gower scaling – takes an input matrix giving dissimilarities between pairs of items and outputs a coordinate matrix whose configuration minimizes a loss function called *strain*.
- Metric multidimensional scaling: a superset of classical MDS that generalizes the optimization procedure to a variety of loss functions and input matrices of known distances with weights and so on. A useful loss function in this context is called *stress*, which is often minimized using a procedure called stress majorization.
- Non-metric multidimensional scaling: in contrast to metric MDS, non-metric MDS finds both a non-parametric monotonic relationship between the dissimilarities in the item–item matrix and the Euclidean distances between items, and the location of each item in the low-dimensional space. The relationship is typically found using isotonic regression. Louis Guttman's **smallest space analysis** (SSA) is an example of a non-metric MDS procedure.
- Generalized multidimensional scaling: an extension of metric multidimensional scaling, in which the target space is an arbitrary smooth non-Euclidean space. In cases where the dissimilarities are distances on a surface and the target space is another surface, GMDS allows finding the minimum-distortion embedding of one surface into another.^{[1]}
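As a concrete illustration of the classical (Torgerson) variant described above, here is a NumPy sketch that double-centers the squared-distance matrix and keeps the top-[math]k[/math] eigenpairs; with exact Euclidean distances it recovers the configuration up to rotation and reflection. The function name and test points are assumptions for the example:

```python
import numpy as np

def classical_mds(D, k=2):
    """Torgerson scaling: embed items given a matrix D of pairwise distances."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)     # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:k]    # take the top-k eigenpairs
    scale = np.sqrt(np.maximum(eigvals[order], 0.0))
    return eigvecs[:, order] * scale         # coordinates minimizing strain

# Distances computed from a known 2-D configuration (a unit square) are
# recovered exactly, up to rotation and reflection of the embedding.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
Y = classical_mds(D, k=2)
D_hat = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
print(np.allclose(D, D_hat))  # True
```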

- ↑ Bronstein AM, Bronstein MM, Kimmel R (January 2006). "Generalized multidimensional scaling: a framework for isometry-invariant partial surface matching". *Proc. Natl. Acad. Sci. U.S.A.* **103** (5): 1168–72. doi:10.1073/pnas.0508601103. PMC 1360551. PMID 16432211. http://www.pnas.org/cgi/pmidlookup?view=long&pmid=16432211.

### 2002

- (Fodor, 2002) ⇒ Imola K. Fodor. (2002). “A Survey of Dimension Reduction Techniques." LLNL technical report, UCRL ID-148494
- QUOTE: Given [math]n[/math] items in a [math]p[/math]-dimensional space and an [math]n \times n[/math] matrix of proximity measures among the items, multidimensional scaling (MDS) produces a [math]k[/math]-dimensional, [math]k \le p[/math], representation of the items such that the distances among the points in the new space reflect the proximities in the data [8, 7, 41]. The proximity measures the (dis)similarities among the items, and in general, it is a distance measure: the more similar two items are, the smaller their distance is. Popular distance measures include the Euclidean distance (L2 norm), the manhattan distance (L1, absolute norm), and the maximum norm. Results of MDS are indeterminate with respect to translation, rotation, and reflection.

MDS has been typically used to transform the data into two or three dimensions, and visualizing the result to uncover hidden structure in the data.
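The three distance measures named in the quote can be checked on a toy pair of points (values chosen only for illustration):

```python
import numpy as np

a, b = np.array([1.0, 4.0]), np.array([4.0, 0.0])

d_euclid = np.linalg.norm(a - b)       # L2 (Euclidean) norm
d_manhattan = np.abs(a - b).sum()      # L1 (absolute) norm
d_max = np.abs(a - b).max()            # maximum (L-infinity) norm
print(d_euclid, d_manhattan, d_max)    # 5.0 7.0 4.0
```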

### 2001

- (Cox & Cox, 2001) ⇒ Trevor F. Cox, and Michael A. A. Cox. "Multidimensional Scaling, 2nd Edition." Chapman and Hall. ISBN:1584880945

### 1997

- (Borg & Groenen, 1997) ⇒ Ingwer Borg, and Patrick J. F. Groenen. (1997). “Modern Multidimensional Scaling: Theory and applications." Springer. ISBN:0387948457

### 1978

- (Kruskal & Wish, 1978) ⇒ Joseph B. Kruskal, and Myron Wish. (1978). “Multidimensional Scaling: Quantitative applications in the social sciences." SAGE. ISBN:0803909403