# Linear Independence Relation


A Linear Independence Relation is a vector set relation (for a vector set $\displaystyle{ \{v_1,v_2,\dots, v_n\} }$) in which no member vector can be represented as a linear combination of the others.

• AKA: VLIR (Vector-based Linear Independence Relation), Linearly Independent.
• Context:
• It can be tested by solving the equation $\displaystyle{ \alpha_1 v_1+\alpha_2 v_2+\dots +\alpha_n v_n=0 }$: the set is linearly independent if and only if the only solution is the trivial one, $\displaystyle{ \alpha_i=0 }$ for $\displaystyle{ i=1,\dots,n }$.
• If $\displaystyle{ V }$ is a vector space, then a linearly independent set of vectors in $\displaystyle{ V }$ consisting of the maximum possible number of vectors is called a basis for $\displaystyle{ V }$. The number of vectors in a basis for $\displaystyle{ V }$ equals the dimension of $\displaystyle{ V }$.
• Let $\displaystyle{ v_i=\begin{bmatrix}a_{i1} \\a_{i2} \\ \vdots \\a_{im} \end{bmatrix};i=1,\dots,n }$.

For a set of vectors $\displaystyle{ \{v_1,v_2,\dots,v_n \} }$, each having $\displaystyle{ m }$ components, the maximum number of linearly independent vectors that a subset of $\displaystyle{ \{v_1,v_2,\dots,v_n \} }$ can contain is $\displaystyle{ m }$; such a subset of $\displaystyle{ m }$ linearly independent vectors, for example $\displaystyle{ \{v_1,v_2,\dots,v_m \} }$, is a basis for the span of $\displaystyle{ \{v_1,v_2,\dots,v_n \} }$, which then has dimension $\displaystyle{ m }$.

Any set of $\displaystyle{ n }$ vectors $\displaystyle{ \{v_1,v_2,\dots,v_n\} }$ whose members have $\displaystyle{ m (\lt n) }$ components is always linearly dependent.

• The Matrix Rank r (a nonnegative integer value) of a matrix $\displaystyle{ A }$ equals the maximum number of linearly independent column vectors of $\displaystyle{ A }$. Since $\displaystyle{ A }$ and its transpose $\displaystyle{ A^T }$ have the same rank, it can also be said that the rank of a matrix $\displaystyle{ A }$ equals the maximum number of linearly independent row vectors of $\displaystyle{ A }$.
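The trivial-solution test above can be sketched in code. This is a minimal illustration, not part of the article: it assumes NumPy, and uses the fact that a set of vectors is linearly independent exactly when the matrix having them as columns has full column rank (the helper name `is_linearly_independent` is hypothetical).

```python
import numpy as np

def is_linearly_independent(vectors):
    """Return True if a1*v1 + ... + an*vn = 0 admits only the
    trivial solution a1 = ... = an = 0, i.e. the matrix whose
    columns are the vectors has full column rank."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

v1 = np.array([1, 1, 1])
v2 = np.array([0, 0, 1])
v3 = np.array([1, 0, 0])
v4 = np.array([0, 1, 0])

print(is_linearly_independent([v1, v2]))          # True
# Four vectors with only three components must be dependent:
print(is_linearly_independent([v1, v2, v3, v4]))  # False
```

The second call illustrates the statement above that any $n$ vectors with $m (\lt n)$ components are linearly dependent.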
• Example(s):
• $\displaystyle{ \mbox{VLIR} \left( \left\lbrace \begin{bmatrix}1 \\1 \\1 \end{bmatrix}, \begin{bmatrix}0 \\0 \\1 \end{bmatrix} \right\rbrace \right) \rightarrow \mbox{True} }$.
• $\displaystyle{ \mbox{VLIR} \left( \left\lbrace \begin{bmatrix}1 \\1 \\1 \end{bmatrix}, \begin{bmatrix}0 \\0 \\1 \end{bmatrix}, \begin{bmatrix}1 \\0 \\0 \end{bmatrix} \right\rbrace \right) \rightarrow \mbox{True} }$.
• $\displaystyle{ \mbox{VLIR} \left( \left\lbrace \begin{bmatrix}1 \\1 \\1 \end{bmatrix}, \begin{bmatrix}0 \\0 \\1 \end{bmatrix}, \begin{bmatrix}1 \\0 \\0 \end{bmatrix}, \begin{bmatrix}0 \\1 \\0 \end{bmatrix} \right\rbrace \right) \rightarrow \mbox{False} }$.
• $\displaystyle{ \mbox{VLIR} \left( \left\lbrace \begin{bmatrix}1 \\0 \\0 \end{bmatrix}, \begin{bmatrix}0 \\1 \\0 \end{bmatrix},\begin{bmatrix}0 \\0 \\1 \end{bmatrix} \right\rbrace \right) \rightarrow \mbox{True} }$.
• $\displaystyle{ \mbox{VLIR} \left( \left\lbrace v_1 = (1, 1), v_2 = (−3, 2) \right\rbrace \right) \rightarrow \mbox{True} }$.
• Let $\displaystyle{ V }$ be a vector space consisting of all the points of the spherical region $\displaystyle{ x^2+y^2+z^2 \leqslant 4 }$, which can be denoted as $\displaystyle{ V=\{(x,y,z)^T \in \mathbb{R}^3 : x^2+y^2+z^2 \leqslant 4\} }$.

It can easily be verified that $\displaystyle{ v_1=\begin{bmatrix}1 \\1 \\1 \end{bmatrix} \in V }$. Also $\displaystyle{ v_2=\begin{bmatrix}1 \\0 \\0 \end{bmatrix}, v_3=\begin{bmatrix}0 \\1 \\0 \end{bmatrix}, v_4=\begin{bmatrix}0 \\0 \\1 \end{bmatrix} \in V }$.

It can also be observed that there are exactly three linearly independent vectors $\displaystyle{ \left\lbrace \begin{bmatrix}1 \\0 \\0 \end{bmatrix}, \begin{bmatrix}0 \\1 \\0 \end{bmatrix},\begin{bmatrix}0 \\0 \\1 \end{bmatrix} \right\rbrace }$ whose linear combinations generate all the vectors of $\displaystyle{ V }$. So the set $\displaystyle{ \left\lbrace \begin{bmatrix}1 \\0 \\0 \end{bmatrix}, \begin{bmatrix}0 \\1 \\0 \end{bmatrix},\begin{bmatrix}0 \\0 \\1 \end{bmatrix} \right\rbrace }$ is called a basis for $\displaystyle{ V }$. The dimension of the vector space $\displaystyle{ V }$ is 3, because the basis set contains 3 vectors.

For the vector space $\displaystyle{ V }$, if we add another vector belonging to $\displaystyle{ V }$ to the basis set of $\displaystyle{ V }$, then the new set of vectors will be linearly dependent. For example, $\displaystyle{ \left\lbrace \begin{bmatrix}1 \\1 \\1 \end{bmatrix},\begin{bmatrix}1 \\0 \\0 \end{bmatrix}, \begin{bmatrix}0 \\1 \\0 \end{bmatrix},\begin{bmatrix}0 \\0 \\1 \end{bmatrix} \right\rbrace }$ is linearly dependent.

• The matrix $\displaystyle{ A=\begin{bmatrix}1 & 1 & 1 \\1 & 0 & 0 \\0 & 1 & 0 \\0 & 0 & 1 \end{bmatrix} }$ has three linearly independent rows (since row 1 = row 2 + row 3 + row 4), so the rank of $\displaystyle{ A }$ is 3 (it also has three linearly independent columns).
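The rank claim in the last example can be checked numerically. This is an illustrative sketch assuming NumPy; it also confirms that $A$ and $A^T$ have the same rank, as stated above.

```python
import numpy as np

# Matrix from the example: row 1 = row 2 + row 3 + row 4,
# so only three rows are linearly independent.
A = np.array([[1, 1, 1],
              [1, 0, 0],
              [0, 1, 0],
              [0, 0, 1]])

print(np.linalg.matrix_rank(A))    # 3
print(np.linalg.matrix_rank(A.T))  # 3: A and its transpose have the same rank
```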
• Counter-Example(s):
• See: Vector Space, Vector Basis, Hamming Code, Basis Vectors, Linear Combination, Indexed Family, Linearly Dependent Curves, Matrix Rank, Maximally Linearly Independent.

## References

### 2015

• (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/linear_independence Retrieved:2015-12-7.
• In the theory of vector spaces the concept of linear dependence and linear independence of the vectors in a subset of the vector space is central to the definition of dimension. A set of vectors is said to be linearly dependent if one of the vectors in the set can be defined as a linear combination of the other vectors. If no vector in the set can be written in this way, then the vectors are said to be linearly independent. [1]

A vector space can be of finite dimension or infinite dimension depending on the number of linearly independent basis vectors. The definition of linear dependence and the ability to determine whether a subset of vectors in a vector space are linearly dependent are central to determining a set of basis vectors for a vector space.

1. G. E. Shilov, Linear Algebra (Trans. R. A. Silverman), Dover Publications, New York, 1977.

### 2009

• http://ltcconline.net/greenl/courses/203/Vectors/basisDimension.htm
• Basis: In our previous discussion, we introduced the concepts of span and linear independence. In a way a set of vectors S = {v1, ..., vk} span a vector space V if there are enough of the right vectors in S, while they are linearly independent if there are no redundancies. We now combine the two concepts.
• Definition of Basis: Let V be a vector space and S = {v1, v2, ..., vk} be a subset of V. Then S is a basis for V if the following two statements are true.
• 1. S spans V.
• 2. S is a linearly independent set of vectors in V.
• We have seen that any vector space that contains at least two vectors contains infinitely many. It is uninteresting to ask how many vectors there are in a vector space. However, there is still a way to measure the size of a vector space. For example, R3 should be larger than R2. We call this size the dimension of the vector space and define it as the number of vectors that are needed to form a basis. To show that the dimension is well defined, we need the following theorem.
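The two-part basis definition above (spanning plus linear independence) can be sketched for the concrete case of $\mathbb{R}^n$. This is an assumption-laden illustration, not part of the referenced page: it uses NumPy, and the helper name `is_basis_of_Rn` is hypothetical. For $k$ vectors in $\mathbb{R}^n$, spanning means the stacked matrix has rank $n$ and independence means it has rank $k$; both hold exactly when $k = n$ and the matrix is nonsingular.

```python
import numpy as np

def is_basis_of_Rn(vectors, n):
    """S is a basis for R^n iff S spans R^n (rank == n) and S is
    linearly independent (rank == len(S)); both hold exactly when
    len(S) == n and the n x n matrix of the vectors has full rank."""
    A = np.column_stack(vectors)
    return A.shape == (n, n) and np.linalg.matrix_rank(A) == n

e1, e2, e3 = np.eye(3)
print(is_basis_of_Rn([e1, e2, e3], 3))  # True: the standard basis
print(is_basis_of_Rn([e1, e2], 3))      # False: independent, but spans only a plane
```

The second call shows why both conditions are needed: two independent vectors in $\mathbb{R}^3$ fail the spanning requirement.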