# Linear Transformation Operation


A Linear Transformation Operation is a transformation operation that preserves the operations of vector addition and scalar multiplication.

• AKA: Linear Mapping.
• Context:
• It can be defined as follows: let $\displaystyle{ V }$ and $\displaystyle{ W }$ be vector spaces over the field $\displaystyle{ F }$; a linear transformation from $\displaystyle{ V }$ into $\displaystyle{ W }$ is a function $\displaystyle{ T }$ from $\displaystyle{ V }$ into $\displaystyle{ W }$ such that $\displaystyle{ T(c\alpha + \beta)=c(T \alpha)+T \beta }$ for all $\displaystyle{ \alpha }$ and $\displaystyle{ \beta }$ in $\displaystyle{ V }$ and all scalars $\displaystyle{ c }$ in $\displaystyle{ F }$.
• It can be a Surjective Linear Transformation.
• It can be an Injective Linear Transformation.
• It can be produced by a Linear Mapping Task.
• It can range from being a Linear Function to a Linear Matrix Transformation.
• For two linear transformations $\displaystyle{ T_1:V \to V }$ and $\displaystyle{ T_2:V \to V }$, the composite transformations $\displaystyle{ T_1T_2 }$ and $\displaystyle{ T_2T_1 }$ are also linear transformations from $\displaystyle{ V \to V }$, but in general $\displaystyle{ T_1T_2 \neq T_2T_1 }$.
• For a linear transformation $\displaystyle{ T:V \to W }$, the collection of all elements $\displaystyle{ w \in W }$ such that $\displaystyle{ w=T(v) }$ for some $\displaystyle{ v \in V }$ is called the range of $\displaystyle{ T }$ and is denoted $\displaystyle{ ran(T) }$. That is,

$\displaystyle{ ran(T)=\{T(v)|v \in V\} }$.

• For a linear transformation $\displaystyle{ T:V \to W }$, the set of all elements of $\displaystyle{ V }$ that are mapped to the zero element by $\displaystyle{ T }$ is called the kernel (or null space) of $\displaystyle{ T }$ and is denoted $\displaystyle{ ker(T) }$. That is,

$\displaystyle{ ker(T)=\{v|T(v)=0\} }$.
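The defining property $\displaystyle{ T(c\alpha + \beta)=c(T \alpha)+T \beta }$ can be spot-checked numerically. A minimal Python sketch (the helper names are illustrative, not from any library):

```python
# Spot-check the defining property T(c*a + b) == c*T(a) + T(b)
# on sample vectors, represented as plain tuples.

def add(u, v):
    return tuple(x + y for x, y in zip(u, v))

def scale(c, u):
    return tuple(c * x for x in u)

def satisfies_linearity(T, a, b, c, tol=1e-9):
    lhs = T(add(scale(c, a), b))           # T(c*a + b)
    rhs = add(scale(c, T(a)), T(b))        # c*T(a) + T(b)
    return all(abs(x - y) < tol for x, y in zip(lhs, rhs))

# Example: the map (x, y, z) -> (y + z, y - z) used later in this article.
T = lambda v: (v[1] + v[2], v[1] - v[2])

print(satisfies_linearity(T, (1.0, 2.0, 3.0), (4.0, 5.0, 6.0), 2.5))  # True
```

Checking finitely many samples cannot prove linearity, but a single failing sample disproves it, which is how the counter-example below works.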

• Example(s):
• a Zero Map.
• an Identity Map.
• a Linear Transformation Addition Operation ($\displaystyle{ + }$), where for $\displaystyle{ T_1:V \to W }$ and $\displaystyle{ T_2:V \to W }$, $\displaystyle{ T_1 + T_2 }$ is also a linear transformation (from $\displaystyle{ V \to W }$).
• a Constant Multiplication Function, $\displaystyle{ x \mapsto cx }$, where $\displaystyle{ c }$ is a constant.
• a Fourier Transform.
• a Haar Transform.
• a Bilinear Function with one argument held fixed (a bilinear map is linear in each argument separately).
• a Linear Projection, such as an orthogonal projection.
• $\displaystyle{ T(x)=-x/2 }$, with scale compression and scale reflection.
• a transformation $\displaystyle{ T:\mathbb{R}^3 \to \mathbb{R}^2 }$ defined by $\displaystyle{ T\begin{pmatrix} x \\ y \\ z \end{pmatrix}=\begin{pmatrix} y+z \\ y-z \end{pmatrix} }$ is a linear transformation, since $\displaystyle{ T }$ satisfies the property $\displaystyle{ T(c\alpha + \beta)=c(T \alpha)+T \beta }$ for all $\displaystyle{ \alpha, \beta \in V }$ and all scalars $\displaystyle{ c \in F }$.

Let $\displaystyle{ \alpha=\begin{pmatrix} x_1 \\ y_1 \\ z_1 \end{pmatrix} \in V }$, $\displaystyle{ \beta=\begin{pmatrix} x_2 \\ y_2 \\ z_2 \end{pmatrix} \in V }$, and scalar $\displaystyle{ c \in F }$.

So $\displaystyle{ T\alpha=T(\alpha)=T\begin{pmatrix} x_1 \\ y_1 \\ z_1 \end{pmatrix}=\begin{pmatrix} y_1+z_1 \\ y_1-z_1 \end{pmatrix} \in \mathbb{R}^2=W }$ and $\displaystyle{ T \beta=T(\beta)=T\begin{pmatrix} x_2 \\ y_2 \\ z_2 \end{pmatrix}=\begin{pmatrix} y_2+z_2 \\ y_2-z_2 \end{pmatrix} \in \mathbb{R}^2=W }$.

Then $\displaystyle{ T(c\alpha+\beta)=T\begin{pmatrix} cx_1+x_2 \\ cy_1+y_2 \\ cz_1+z_2 \end{pmatrix}=\begin{pmatrix} cy_1+y_2+cz_1+z_2 \\ cy_1+y_2-cz_1-z_2 \end{pmatrix}=\begin{pmatrix} c(y_1+z_1)+y_2+z_2 \\ c(y_1-z_1)+y_2-z_2 \end{pmatrix}=c\begin{pmatrix} y_1+z_1 \\ y_1-z_1 \end{pmatrix}+ \begin{pmatrix} y_2+z_2 \\ y_2-z_2 \end{pmatrix}=c(T\alpha) + T \beta }$, which proves that $\displaystyle{ T }$ is a linear transformation.

With a little computation it can also be found that the transformation matrix is $\displaystyle{ T=\begin{pmatrix}0 & 1 & 1 \\ 0 & 1 & -1 \end{pmatrix} }$.

It can be verified that $\displaystyle{ T\begin{pmatrix} x \\ y \\ z \end{pmatrix}=\begin{pmatrix}0 & 1 & 1 \\ 0 & 1 & -1 \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix}=\begin{pmatrix} y+z \\ y-z \end{pmatrix} }$.
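This matrix form can be checked in a few lines of plain Python; the `matvec` helper below is an illustrative name, not a library function:

```python
# Verify that multiplying by the matrix [[0, 1, 1], [0, 1, -1]]
# reproduces T(x, y, z) = (y + z, y - z).

A = [[0, 1, 1],
     [0, 1, -1]]

def matvec(M, v):
    """Plain matrix-vector product: one dot product per row of M."""
    return tuple(sum(row[i] * v[i] for i in range(len(v))) for row in M)

def T(v):
    x, y, z = v
    return (y + z, y - z)

v = (7, 2, 5)
print(matvec(A, v))            # (7, -3)
print(T(v) == matvec(A, v))    # True
```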

• Counter-Example(s):
• a Non-Linear Transformation, such as $\displaystyle{ x\mapsto x^2 }$.
• a Cosine Function, $\displaystyle{ x \mapsto \cos(x) }$.
• a transformation $\displaystyle{ T:\mathbb{R}^3 \to \mathbb{R}^1 }$ defined by $\displaystyle{ T\begin{pmatrix}x \\y \\z \end{pmatrix}=x^2+y^2+z^2 }$ is not a linear transformation, since $\displaystyle{ T }$ does not satisfy the property $\displaystyle{ T(c\alpha + \beta)=c(T \alpha)+T \beta }$ for all $\displaystyle{ \alpha, \beta \in V }$ and all scalars $\displaystyle{ c \in F }$.

The property can be verified as follows.

Let $\displaystyle{ \alpha=\begin{pmatrix}x_1 \\y_1 \\z_1 \end{pmatrix} \in \mathbb{R}^3=V }$, $\displaystyle{ \beta=\begin{pmatrix}x_2 \\y_2 \\z_2 \end{pmatrix} \in \mathbb{R}^3=V }$, and scalar $\displaystyle{ c \in F }$.

So $\displaystyle{ T\alpha=T(\alpha)=T\begin{pmatrix}x_1 \\y_1 \\z_1 \end{pmatrix}={x_1}^2+{y_1}^2+{z_1}^2 \in \mathbb{R}^1=W }$ and $\displaystyle{ T\beta=T(\beta)=T\begin{pmatrix}x_2 \\y_2 \\z_2 \end{pmatrix}={x_2}^2+{y_2}^2+{z_2}^2 \in \mathbb{R}^1=W }$

$\displaystyle{ T(c \alpha+\beta)=T\begin{pmatrix}cx_1+x_2 \\cy_1+y_2 \\cz_1+z_2 \end{pmatrix}={(cx_1+x_2)}^2+{(cy_1+y_2)}^2+{(cz_1+z_2)}^2 \neq cT(\alpha)+T(\beta) }$ in general.
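A single concrete input is enough to exhibit the failure. A short Python sketch:

```python
# Numeric counterexample: T(x, y, z) = x^2 + y^2 + z^2 is not linear,
# since T(c*a + b) differs from c*T(a) + T(b) for concrete inputs.

def T(v):
    return sum(x * x for x in v)

a, b, c = (1, 0, 0), (1, 0, 0), 2.0
cab = tuple(c * x + y for x, y in zip(a, b))  # c*a + b = (3.0, 0.0, 0.0)

print(T(cab))                       # 9.0
print(c * T(a) + T(b))              # 3.0
print(T(cab) == c * T(a) + T(b))    # False -> T is not linear
```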

• See: Linear Model Training, Homomorphism, Independent Component Analysis, Vector Space, Category Theory, Linear Independence, Linear Algebra Concept.

## References

### 2015

• (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/Linear_map#Definition_and_first_consequences Retrieved:2015-1-30.
• Let V and W be vector spaces over the same field K. A function f: V → W is said to be a linear map if for any two vectors x and y in V and any scalar α in K, the following two conditions are satisfied:
• $\displaystyle{ f(\mathbf{x}+\mathbf{y}) = f(\mathbf{x})+f(\mathbf{y}) \! }$ additivity.
• $\displaystyle{ f(\alpha \mathbf{x}) = \alpha f(\mathbf{x}) \! }$ homogeneity of degree 1
• This is equivalent to requiring the same for any linear combination of vectors, i.e. that for any vectors x1, ..., xm ∈ V and scalars a1, ..., am ∈ K, the following equality holds: $\displaystyle{ f(a_1 \mathbf{x}_1+\cdots+a_m \mathbf{x}_m) = a_1 f(\mathbf{x}_1)+\cdots+a_m f(\mathbf{x}_m). }$ Denoting the zero elements of the vector spaces V and W by 0V and 0W respectively, it follows that f(0V) = 0W, because letting α = 0 in the equation for homogeneity of degree 1 gives $\displaystyle{ f(\mathbf{0}_{V}) = f(0 \cdot \mathbf{0}_{V}) = 0 \cdot f(\mathbf{0}_{V}) = \mathbf{0}_{W} . }$ Occasionally, V and W can be considered to be vector spaces over different fields. It is then necessary to specify which of these ground fields is being used in the definition of "linear". If V and W are considered as spaces over the field K as above, we talk about K-linear maps. For example, the conjugation of complex numbers is an R-linear map C → C, but it is not C-linear.

A linear map from V to K (with K viewed as a vector space over itself) is called a linear functional.

These statements generalize to any left-module RM over a ring R without modification.
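The conjugation example from the quote above can be checked directly with Python's built-in complex type:

```python
# Complex conjugation is additive and R-homogeneous, hence R-linear,
# but homogeneity fails for complex scalars, so it is not C-linear.

conj = lambda z: z.conjugate()

# Additivity holds:
print(conj((1 + 2j) + (3 - 1j)) == conj(1 + 2j) + conj(3 - 1j))  # True
# Homogeneity holds for a real scalar:
print(conj(2.0 * (1 + 2j)) == 2.0 * conj(1 + 2j))                # True
# ...but fails for the complex scalar 1j:
print(conj(1j * (1 + 2j)) == 1j * conj(1 + 2j))                  # False
```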

### 2012

• Mark V. Sapir. http://www.math.vanderbilt.edu/~msapir/msapir/feb19.html
• QUOTE: A function from $\displaystyle{ \R^n }$ to $\displaystyle{ \R^m }$ which takes every $\displaystyle{ n }$-vector $\displaystyle{ v }$ to the $\displaystyle{ m }$-vector $\displaystyle{ Av }$, where $\displaystyle{ A }$ is an $\displaystyle{ m }$ by $\displaystyle{ n }$ matrix, is called a linear transformation. The matrix $\displaystyle{ A }$ is called the standard matrix of this transformation. If $\displaystyle{ n=m }$ then the transformation is called a linear operator of the vector space $\displaystyle{ \R^n }$.

Notice that by the definition, the linear transformation with standard matrix $\displaystyle{ A }$ takes every vector $\displaystyle{ (x_1,...,x_n) }$ from $\displaystyle{ \mathbb{R}^n }$ to the vector $\displaystyle{ (A(1,1)x_1+...+A(1,n)x_n,\ A(2,1)x_1+...+A(2,n)x_n,\ ...,\ A(m,1)x_1+...+A(m,n)x_n) }$ from $\displaystyle{ \mathbb{R}^m }$, where $\displaystyle{ A(i,j) }$ are the entries of $\displaystyle{ A }$. Conversely, every transformation from $\displaystyle{ \mathbb{R}^n }$ to $\displaystyle{ \mathbb{R}^m }$ given by a formula of this kind is a linear transformation, and the coefficients $\displaystyle{ A(i,j) }$ form the standard matrix of this transformation.

Examples. 1. Consider the transformation of $\displaystyle{ \mathbb{R}^2 }$ which takes each vector $\displaystyle{ (a,b) }$ to the opposite vector $\displaystyle{ (-a,-b) }$. This is a linear operator with standard matrix $\displaystyle{ \begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix} }$.
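This last example can be sketched in plain Python; `matvec` is an illustrative helper, not a library call:

```python
# The operator (a, b) -> (-a, -b) equals multiplication by the
# standard matrix [[-1, 0], [0, -1]].

A = [[-1, 0],
     [0, -1]]

def matvec(M, v):
    """Plain matrix-vector product: one dot product per row of M."""
    return tuple(sum(row[i] * v[i] for i in range(len(v))) for row in M)

print(matvec(A, (3, -4)))  # (-3, 4)
```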