# Matrix

A matrix is a homogeneous two-dimensional array composed of numbers.

**AKA:** Arbitrarily Shaped Matrix.

**Context:**
- It can be of [math]\displaystyle{ m }[/math]×[math]\displaystyle{ n }[/math] dimension and denoted as [math]\displaystyle{ A=[a_{ij}]_{\substack{ i=1\dots m\\j=1\dots n}}=A_{mn} = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix} }[/math]
- It can (typically) have [math]\displaystyle{ n\gt 0 }[/math] Matrix Columns (matrix dimensionality).
- It can (typically) have [math]\displaystyle{ m\gt 0 }[/math] Matrix Rows.
- It can range from being an Abstract Matrix to being a Matrix Data Structure.
- It can range from being a Rectangular Matrix to being a Square Matrix (such as a skew-symmetric matrix).
- It can range from being a 2-D Matrix to being a 3-D Matrix, to being an …
- It can range from being a Boolean Matrix to being a Numeric Matrix (such as an integer matrix or a complex matrix).
- It can range from being a Dense Matrix to being a Sparse Matrix.
- It can range from being an Orthogonal Matrix to being a Unitary Matrix.
- It can be an input to a Matrix Operation, such as matrix multiplication and matrix factorization.
- It can have an Eigenvalue.
- It can be a Matrix Mapping.
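The homogeneous [math]\displaystyle{ m }[/math]×[math]\displaystyle{ n }[/math] layout described above can be sketched as a minimal data structure (a vector of equal-sized row vectors); this is an illustrative sketch, not a reference implementation:

```python
# A matrix as a homogeneous 2-D array: a list of equal-length numeric rows.
A = [
    [1, 9, 13],
    [20, 55, 6],
]

def shape(matrix):
    """Return (m, n): the number of rows and columns of a rectangular matrix."""
    m = len(matrix)
    n = len(matrix[0]) if m else 0
    # Every row must have the same length, or the array is not a matrix.
    assert all(len(row) == n for row in matrix), "rows must be equal length"
    return m, n

m, n = shape(A)  # (2, 3): 2 rows, 3 columns
```

A jagged list of lists (rows of differing length) would fail the check, since a matrix is rectangular by definition.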

**Example(s):**
- a 2x3 Matrix [math]\displaystyle{ \begin{bmatrix}1 & 9 & 13 \\20 & 55 & 6 \end{bmatrix} }[/math] or a 3x2 Matrix.
- an Identity Matrix [math]\displaystyle{ \begin{bmatrix}1 & 0 \\0 & 1 \end{bmatrix}. }[/math]
- a Vector of Vectors (of equal sized Numeric Vectors).
- a Triangular Matrix.
- a mapping [math]\displaystyle{ T:A \to B }[/math] can be a matrix.
Here [math]\displaystyle{ T }[/math] is a matrix that maps each vector [math]\displaystyle{ a\in A }[/math] (which can be seen as an input column matrix) to a vector [math]\displaystyle{ b\in B }[/math] (the respective output column matrix for the input [math]\displaystyle{ a }[/math]).

If [math]\displaystyle{ T=\begin{bmatrix}0 & -1 \\1 & 0 \end{bmatrix} }[/math] and [math]\displaystyle{ a=\begin{bmatrix}1 \\1 \end{bmatrix} }[/math], then the output is [math]\displaystyle{ b=\begin{bmatrix}-1 \\1 \end{bmatrix} }[/math]:

[math]\displaystyle{ Ta=b: \begin{bmatrix}0 & -1 \\1 & 0 \end{bmatrix}\begin{bmatrix}1 \\1 \end{bmatrix}=\begin{bmatrix}-1 \\1 \end{bmatrix} }[/math]

Here it is important to note that the matrix [math]\displaystyle{ T }[/math] transforms each input vector [math]\displaystyle{ a }[/math] into an output vector [math]\displaystyle{ b }[/math] that is orthogonal to it. Because [math]\displaystyle{ T }[/math] is itself an orthogonal matrix, it is called an orthogonal transformation (or an orthogonal mapping).
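The computation [math]\displaystyle{ Ta=b }[/math] above can be reproduced with a small matrix-vector product; the helper name `matvec` is illustrative:

```python
# The mapping T a = b from the example above, computed directly.
# T rotates a 2-D vector by 90 degrees counter-clockwise.
T = [[0, -1],
     [1,  0]]

def matvec(M, v):
    """Multiply matrix M by column vector v (one dot product per row)."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

a = [1, 1]
b = matvec(T, a)  # [-1, 1]

# b is orthogonal to a: their dot product is zero.
dot = sum(x * y for x, y in zip(a, b))  # 0
```

The zero dot product confirms that each input vector is carried to a vector orthogonal to it, as stated above.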

**Counter-Example(s):**
- a Second-Rank Tensor.
- a Tuple Set.
- a List.

**See:** Linear Algebra, Eigenvector.

## References

### 2015

- http://en.wikipedia.org/wiki/List_of_matrices
- A **matrix** (plural matrices, or less commonly matrixes) is a rectangular array of numbers called *entries*. Matrices have a long history of both study and application, leading to diverse ways of classifying matrices. A first group is matrices satisfying concrete conditions of the entries, including constant matrices.

- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/comparison_of_linear_algebra_libraries Retrieved:2015-2-1.
- The following tables provide a comparison of **linear algebra software libraries**, either specialized or general purpose libraries with significant linear algebra coverage. ...
- … Matrix types (special types like bidiagonal/tridiagonal are not listed):
 - *Real* - general (nonsymmetric) real.
 - *Complex* - general (nonsymmetric) complex.
 - *SPD* - symmetric positive definite (real).
 - *HPD* - Hermitian positive definite (complex).
 - *SY* - symmetric (real).
 - *HE* - Hermitian (complex).
 - *BND* - band.
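One way to test whether a real matrix falls into the *SPD* class listed above is to attempt a Cholesky factorization [math]\displaystyle{ A=LL^T }[/math], which succeeds exactly when the matrix is symmetric positive definite. A minimal pure-Python sketch (the function name `is_spd` is illustrative):

```python
import math

def is_spd(A, tol=1e-12):
    """Return True if A is symmetric positive definite (SPD),
    by attempting a Cholesky factorization A = L L^T."""
    n = len(A)
    # Symmetry check: A must equal its transpose (within tolerance).
    if any(abs(A[i][j] - A[j][i]) > tol for i in range(n) for j in range(n)):
        return False
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                d = A[i][i] - s
                if d <= tol:      # a non-positive pivot: not positive definite
                    return False
                L[i][i] = math.sqrt(d)
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return True

is_spd([[4.0, 1.0], [1.0, 3.0]])   # True
is_spd([[0.0, 1.0], [1.0, 0.0]])   # False (symmetric but indefinite)
```

Production libraries use the same idea: LAPACK's `dpotrf` reports failure via its info flag when the input is not positive definite.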

### 2014

- (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/Matrix_(mathematics) Retrieved:2014-10-4.
- In mathematics, a **matrix** (plural matrices) is a rectangular *array* ^{[1]} of numbers, symbols, or expressions, arranged in *rows* and *columns*. The individual items in a matrix are called its *elements* or *entries*. An example of a matrix with 2 rows and 3 columns is: [math]\displaystyle{ \begin{bmatrix}1 & 9 & -13 \\20 & 5 & -6 \end{bmatrix}. }[/math] Matrices of the same size can be added or subtracted element by element. The rule for matrix multiplication, however, is that two matrices can be multiplied only when the number of columns in the first equals the number of rows in the second. A major application of matrices is to represent linear transformations, that is, generalizations of linear functions. For example, the rotation of vectors in three-dimensional space is a linear transformation which can be represented by a rotation matrix **R**. If **v** is a column vector (a matrix with only one column) describing the position of a point in space, the product **Rv** is a column vector describing the position of that point after a rotation. The product of two matrices is a matrix that represents the composition of two linear transformations. Another application of matrices is in the solution of a system of linear equations. If the matrix is square, it is possible to deduce some of its properties by computing its determinant. For example, a square matrix has an inverse if and only if its determinant is not zero. Eigenvalues and eigenvectors provide insight into the geometry of linear transformations.

 Applications of matrices are found in most scientific fields. In every branch of physics, including classical mechanics, optics, electromagnetism, quantum mechanics, and quantum electrodynamics, they are used to study physical phenomena, such as the motion of rigid bodies. In computer graphics, they are used to project a 3-dimensional image onto a 2-dimensional screen. In probability theory and statistics, stochastic matrices are used to describe sets of probabilities; for instance, they are used within the PageRank algorithm that ranks the pages in a Google search. ^{[2]} Matrix calculus generalizes classical analytical notions such as derivatives and exponentials to higher dimensions.

 A major branch of numerical analysis is devoted to the development of efficient algorithms for matrix computations, a subject that is centuries old and is today an expanding area of research. Matrix decomposition methods simplify computations, both theoretically and practically. Algorithms that are tailored to particular matrix structures, such as sparse matrices and near-diagonal matrices, expedite computations in the finite element method and other computations. Infinite matrices occur in planetary theory and in atomic theory. A simple example of an infinite matrix is the matrix representing the derivative operator, which acts on the Taylor series of a function.
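The multiplication rule quoted above (columns of the first must equal rows of the second) and the composition-of-transformations property can be sketched together; the function name `matmul` is illustrative:

```python
def matmul(A, B):
    """Multiply A (m×n) by B (n×p). Valid only when A's column count
    equals B's row count, as the matrix multiplication rule requires."""
    m, n = len(A), len(A[0])
    n2, p = len(B), len(B[0])
    if n != n2:
        raise ValueError(f"shape mismatch: ({m}x{n}) @ ({n2}x{p})")
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

# Composing two linear transformations = multiplying their matrices:
R90 = [[0, -1], [1, 0]]      # rotate a 2-D vector by 90 degrees
R180 = matmul(R90, R90)      # [[-1, 0], [0, -1]]: rotation by 180 degrees
```

Multiplying, say, a 2x3 matrix by another 2x3 matrix raises the shape error, since 3 columns cannot pair with 2 rows.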

### 2011

- (Wikipedia, 2011) ⇒ http://en.wikipedia.org/wiki/Matrix_(mathematics)
- … A *matrix* is a rectangular arrangement of mathematical expressions that can be simply numbers.
