Linear independence
Linear independence is a concept from linear algebra. It is used to talk about sets of vectors in a vector space. Each vector space has a null vector (the zero vector), which can always be written as a linear combination (a weighted sum) of any given vectors. A set of vectors is called linearly independent if and only if the only linear combination of them that gives the null vector is the one where every coefficient is zero. This is equivalent to saying that none of the vectors can be expressed as a linear combination of the others. If the vectors are not linearly independent, they are called linearly dependent: at least one of them can be written as a linear combination of the rest.
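To make the definition concrete, here is a minimal sketch in Python (using NumPy, which the article itself does not mention, so this is an illustrative assumption). It tests a set of vectors for linear independence by stacking them as the columns of a matrix and checking whether the matrix has full column rank, which holds exactly when the only combination giving the null vector is the all-zero one:

```python
import numpy as np

def is_linearly_independent(vectors):
    """Return True if the given vectors are linearly independent.

    The vectors are stacked as columns of a matrix; they are
    independent exactly when that matrix has full column rank,
    i.e. when the only combination of the columns that yields
    the null vector is the all-zero combination.
    """
    matrix = np.column_stack(vectors)
    return np.linalg.matrix_rank(matrix) == matrix.shape[1]
```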
As an example, take the following four vectors in three-dimensional Euclidean space:
- [math]\displaystyle{ \begin{matrix} \mbox{independent}\qquad\\ \underbrace{ \overbrace{ \begin{bmatrix}0\\0\\1\end{bmatrix}, \begin{bmatrix}0\\2\\-2\end{bmatrix}, \begin{bmatrix}1\\-2\\1\end{bmatrix} }, \begin{bmatrix}4\\2\\3\end{bmatrix} }\\ \mbox{dependent}\\ \end{matrix} }[/math]
Here the first three vectors are linearly independent, but the fourth vector equals nine times the first plus five times the second plus four times the third, so the four vectors together are linearly dependent. Linear dependence is a property of the family of vectors, not of any particular vector; for example, in this case we could just as well write the first vector as a linear combination of the last three:
- [math]\displaystyle{ \mathbf{v}_1 = \left(-\frac{5}{9}\right) \mathbf{v}_2 + \left(-\frac{4}{9}\right) \mathbf{v}_3 + \frac{1}{9} \mathbf{v}_4 . }[/math]
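These coefficients can also be found numerically. The following sketch (again assuming NumPy) solves the linear system whose columns are the first three vectors to recover the weights 9, 5, and 4 stated above for the fourth vector:

```python
import numpy as np

v1 = np.array([0, 0, 1])
v2 = np.array([0, 2, -2])
v3 = np.array([1, -2, 1])
v4 = np.array([4, 2, 3])

# Solve A @ c = v4, where the columns of A are v1, v2, v3.
# A unique solution exists because v1, v2, v3 are independent.
A = np.column_stack([v1, v2, v3])
c = np.linalg.solve(A, v4)
print(c)  # -> [9. 5. 4.], i.e. v4 = 9*v1 + 5*v2 + 4*v3
```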