# Linear Algebra: #4 Linear Independence and Dimension

**Definition**

Let **v**₁, . . . , **v**ₙ ∈ **V** be finitely many vectors in the vector space **V** over the field F. We say that the vectors are *linearly dependent* if there exists a non-trivial equation of the form

a₁·**v**₁ + · · · + aₙ·**v**ₙ = **0**,

that is, one in which not all of the coefficients aᵢ ∈ F are simply zero. If no such non-trivial equation exists, then the set {**v**₁, . . . , **v**ₙ} ⊂ **V** is said to be *linearly independent*.

This definition is undoubtedly the most important idea in the theory of linear algebra!

**Examples**

- In ℜ², let **v**₁ = (1, 0), **v**₂ = (0, 1) and **v**₃ = (1, 1). Then the set {**v**₁, **v**₂, **v**₃} is linearly *dependent*, since we have

  **v**₁ + **v**₂ − **v**₃ = **0**.

  On the other hand, the set {**v**₁, **v**₂} is linearly independent.
- In C₀([0, 1], ℜ), let f₁ : [0, 1] → ℜ be given by f₁(x) = 1 for all x ∈ [0, 1]. Similarly, let f₂ be given by f₂(x) = x, and f₃ by f₃(x) = 1 − x. Then the set {f₁, f₂, f₃} is linearly dependent, since f₂ + f₃ = f₁.
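The first example can be checked numerically. The following NumPy sketch (an illustration, not part of the original notes) uses the fact that finitely many vectors are linearly independent exactly when the matrix having them as columns has rank equal to the number of vectors.

```python
import numpy as np

# The three vectors from the R^2 example above.
v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
v3 = np.array([1.0, 1.0])

# {v1, v2, v3}: rank 2 < 3 vectors, so the set is linearly dependent.
rank_all = np.linalg.matrix_rank(np.column_stack([v1, v2, v3]))

# {v1, v2}: rank 2 = 2 vectors, so this set is linearly independent.
rank_pair = np.linalg.matrix_rank(np.column_stack([v1, v2]))

print(rank_all, rank_pair)  # 2 2
```

The same rank test works for any finite set of vectors in ℜⁿ.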

Now take some vector space **V** over a field F, and let **S** ⊂ **V** be some subset of **V**. (The set **S** can be finite or infinite here, although we will usually be dealing with finite sets.) Let **v**₁, . . . , **v**ₙ ∈ **S** be some finite collection of vectors in **S**, and let a₁, . . . , aₙ ∈ F be some arbitrary collection of elements of the field. Then the sum

a₁·**v**₁ + · · · + aₙ·**v**ₙ

is a *linear combination* of the vectors **v**₁, . . . , **v**ₙ in **S**. The set of all possible linear combinations of vectors in **S** is denoted by span(**S**), and it is called the linear span of **S**. One also writes [**S**], and **S** is called a generating set of [**S**]. Therefore, if [**S**] = **V**, then we say that **S** is a generating set for **V**. If **S** is finite and it generates **V**, then we say that the vector space **V** is finitely generated.

**Theorem 5** Given **S** ⊂ **V**, [**S**] is a subspace of **V**.

*Proof*

A simple consequence of theorem 2: [**S**] contains **0**, and sums and scalar multiples of linear combinations of vectors in **S** are again such linear combinations.

**Examples**

- For any n ∈ ℕ, let **e**ᵢ ∈ ℜⁿ be the vector whose i-th coordinate is 1 and whose other coordinates are all 0. Then S = {**e**₁, **e**₂, . . . , **e**ₙ} is a generating set for ℜⁿ.
- On the other hand, the vector space C₀([0, 1], ℜ) is clearly not finitely generated. (In general, such function spaces, which play a big role in quantum field theory and which are studied using the mathematical theory of *functional analysis*, are not finitely generated. However, in this lecture we will mostly be concerned with finitely generated vector spaces.)
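The first example is easy to verify concretely: every vector in ℜⁿ is recovered as the linear combination of the **e**ᵢ weighted by its own coordinates. A small NumPy sketch (not part of the original notes):

```python
import numpy as np

# Standard basis e_1, ..., e_n of R^n (here n = 4): the rows of the identity.
n = 4
basis = [np.eye(n)[i] for i in range(n)]

# Any x in R^n is the linear combination sum_i x_i * e_i,
# so {e_1, ..., e_n} generates R^n.
x = np.array([3.0, -1.0, 0.5, 2.0])
recombined = sum(x[i] * basis[i] for i in range(n))
print(np.allclose(recombined, x))  # True
```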

So let **S** = {**v**₁, . . . , **v**ₙ} ⊂ **V** be a finite set. From now on in these discussions, we will assume that such sets are finite unless stated otherwise.

**Theorem 6** Let **w** = a₁**v**₁ + · · · + aₙ**v**ₙ be some vector in [**S**] ⊂ **V**, where a₁, . . . , aₙ are arbitrarily given elements of the field F. We will say that this representation of **w** is unique if, given some other linear combination **w** = b₁**v**₁ + · · · + bₙ**v**ₙ, we must have bᵢ = aᵢ for all i = 1, . . . , n. Given this, we have: the set **S** is linearly independent ⇔ the representation of every vector in the span of **S** as a linear combination of vectors in **S** is unique.

*Proof.*

‘⇐’ We certainly have 0·**v**₁ + · · · + 0·**v**ₙ = **0**. Since this representation of the zero vector is unique, it follows that **S** is linearly independent.

‘⇒’ Can it be that **S** is linearly independent, and yet there exists a vector in the span of **S** which is not uniquely represented as a linear combination of the vectors in **S**? Assume that there exist elements a₁, . . . , aₙ and b₁, . . . , bₙ of the field F, with aⱼ ≠ bⱼ for at least one j between 1 and n, such that

a₁**v**₁ + · · · + aₙ**v**ₙ = b₁**v**₁ + · · · + bₙ**v**ₙ.

Then the non-trivial equation

(a₁ − b₁)**v**₁ + · · · + (aₙ − bₙ)**v**ₙ = **0**

shows that **S** cannot be a linearly independent set.
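Theorem 6 has a very concrete face: when the vectors of **S** are linearly independent, the coordinates of **w** are the unique solution of a linear system. A NumPy sketch (an illustration, not part of the notes), with the independent vectors as the columns of an invertible matrix:

```python
import numpy as np

# Columns v1 = (1, 1) and v2 = (1, -1) are linearly independent,
# so A is invertible and A x = w has exactly one solution x = (a1, a2).
A = np.array([[1.0, 1.0],
              [1.0, -1.0]])
w = np.array([3.0, 1.0])

coords = np.linalg.solve(A, w)  # the unique representation w = a1*v1 + a2*v2
print(coords)  # [2. 1.], i.e. w = 2*v1 + 1*v2
```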

**Definition**

Assume that **S** ⊂ **V** is a finite, linearly independent subset with [**S**] = **V**. Then **S** is called a *basis* for **V**.

**Lemma**

Assume that **S** = {**v**₁, . . . , **v**ₙ} ⊂ **V** is linearly dependent. Then there exists some j ∈ {1, . . . , n}, and elements aᵢ ∈ F, for i ≠ j, such that

**v**ⱼ = a₁·**v**₁ + · · · + aⱼ₋₁·**v**ⱼ₋₁ + aⱼ₊₁·**v**ⱼ₊₁ + · · · + aₙ·**v**ₙ.

*Proof*

Since **S** is linearly dependent, there exists some non-trivial linear combination of the elements of **S** summing to the zero vector,

b₁·**v**₁ + · · · + bₙ·**v**ₙ = **0**,

such that bⱼ ≠ 0 for at least one j. Take such a j. Then

**v**ⱼ = ∑_{i≠j} (−bᵢ/bⱼ)·**v**ᵢ,

so the lemma holds with aᵢ = −bᵢ/bⱼ for i ≠ j.
**Corollary**

Let **S** = {**v**₁, . . . , **v**ₙ} ⊂ **V** be linearly dependent, and let **v**ⱼ be as in the lemma above. Let **S'** = {**v**₁, . . . , **v**ⱼ₋₁, **v**ⱼ₊₁, . . . , **v**ₙ} be **S** with the element **v**ⱼ removed. Then [**S'**] = [**S**].
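The corollary can be illustrated numerically: removing a vector that is a linear combination of the others leaves the span, and hence the rank, unchanged. A NumPy sketch (not part of the original notes):

```python
import numpy as np

# v3 = v1 + v2 is redundant: dropping it does not shrink the span.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2

rank_S = np.linalg.matrix_rank(np.column_stack([v1, v2, v3]))
rank_S_prime = np.linalg.matrix_rank(np.column_stack([v1, v2]))
print(rank_S, rank_S_prime)  # 2 2: the spans [S] and [S'] coincide
```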

**Theorem 7** Assume that the vector space **V** is finitely generated. Then there exists a basis for **V**.

*Proof*

Since **V** is finitely generated, there exists a finite generating set. Let **S** be such a finite generating set which has as few elements as possible. If **S** were linearly dependent, then we could remove some element, as in the lemma, leaving us with a still smaller generating set for **V**. This is a contradiction. Therefore **S** must be a basis for **V**.

**Theorem 8** Let **S** = {**v**₁, . . . , **v**ₙ} be a basis for the vector space **V**, and take some arbitrary non-zero vector **w** ∈ **V**. Then there exists some j ∈ {1, . . . , n} such that

**S'** = {**v**₁, . . . , **v**ⱼ₋₁, **w**, **v**ⱼ₊₁, . . . , **v**ₙ}

is also a basis of **V**.

*Proof*

Writing **w** = a₁**v**₁ + · · · + aₙ**v**ₙ, we see that since **w** ≠ **0**, at least one aⱼ ≠ 0. Taking that j, we write

**v**ⱼ = (1/aⱼ)·**w** − ∑_{i≠j} (aᵢ/aⱼ)·**v**ᵢ.

We now prove that [**S'**] = **V**. For this, let **u** ∈ **V** be an arbitrary vector. Since **S** is a basis for **V**, there exists a linear combination **u** = b₁**v**₁ + · · · + bₙ**v**ₙ. Substituting the expression for **v**ⱼ above, we have

**u** = (bⱼ/aⱼ)·**w** + ∑_{i≠j} (bᵢ − bⱼaᵢ/aⱼ)·**v**ᵢ,

which is a linear combination of the vectors in **S'**. This shows that [**S'**] = **V**.

In order to show that **S'** is linearly independent, assume that we have

c·**w** + ∑_{i≠j} cᵢ·**v**ᵢ = **0**

for some c and cᵢ ∈ F, where we set cⱼ = 0 since **v**ⱼ does not occur in **S'**. Substituting **w** = a₁**v**₁ + · · · + aₙ**v**ₙ turns this into ∑ᵢ (c·aᵢ + cᵢ)·**v**ᵢ = **0**. Since the original set **S** was assumed to be linearly independent, we must have c·aᵢ + cᵢ = 0 for all i. In particular, since cⱼ = 0, we have c·aⱼ = 0. But the assumption was that aⱼ ≠ 0. Therefore we must conclude that c = 0. It follows that also cᵢ = 0 for all i ≠ j. Therefore **S'** must be linearly independent.
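The exchange step of theorem 8 is easy to watch in ℜ³. In this NumPy sketch (an illustration, not part of the notes), **w** has a non-zero coefficient on **v**₂, so swapping **w** in for **v**₂ yields a set that is again a basis:

```python
import numpy as np

# Standard basis of R^3, taken as the rows of the identity matrix.
v1, v2, v3 = np.eye(3)

# w = 0*v1 + 2*v2 + 5*v3: the coefficient a2 = 2 is non-zero, so take j = 2.
w = np.array([0.0, 2.0, 5.0])

# Exchange v2 for w; the new set S' = {v1, w, v3} still has rank 3,
# i.e. three linearly independent vectors in R^3: a basis.
S_new = np.column_stack([v1, w, v3])
print(np.linalg.matrix_rank(S_new))  # 3
```

Had we tried j = 1 instead (where a₁ = 0), the resulting set would be dependent, matching the requirement aⱼ ≠ 0 in the proof.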

**Theorem 9 (Steinitz Exchange Theorem)** Let **S** = {**v**₁, . . . , **v**ₙ} be a basis of **V** and let **T** = {**w**₁, . . . , **w**ₘ} ⊂ **V** be some linearly independent set of vectors in **V**. Then we have m ≤ n. By possibly re-ordering the elements of **S**, we may arrange things so that the set

**U** = {**w**₁, . . . , **w**ₘ, **v**ₘ₊₁, . . . , **v**ₙ}

is a basis for **V**.

*Proof*

Use induction over the number m. If m = 0 then **U** = **S** and there is nothing to prove.

Therefore assume m ≥ 1 and, furthermore, that the theorem is true for the case m − 1. So consider the linearly independent set **T'** = {**w**₁, . . . , **w**ₘ₋₁}. After an appropriate re-ordering of **S**, we have **U'** = {**w**₁, . . . , **w**ₘ₋₁, **v**ₘ, . . . , **v**ₙ} being a basis for **V**. Note that if we were to have n < m, then **T'** would itself be a basis for **V**. Thus we could express **w**ₘ as a linear combination of the vectors in **T'**. That would imply that **T** was not linearly independent, contradicting our assumption. Therefore m ≤ n.

Now since **U'** is a basis for **V**, we can express **w**ₘ as a linear combination

**w**ₘ = a₁**w**₁ + · · · + aₘ₋₁**w**ₘ₋₁ + aₘ**v**ₘ + · · · + aₙ**v**ₙ.

If we had all the coefficients of the vectors from **S** being zero, namely

aₘ = aₘ₊₁ = · · · = aₙ = 0,

then we would have **w**ₘ being expressed as a linear combination of the other vectors in **T**. Therefore **T** would be linearly dependent, which is not true. Thus aⱼ ≠ 0 for at least one j ≥ m. Using theorem 8, we may exchange **w**ₘ for the vector **v**ⱼ in **U'**, thus giving us the basis **U**.
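One immediate consequence of Steinitz, m ≤ n, is easy to see numerically: more than n vectors in ℜⁿ can never be linearly independent, since the rank of the matrix they form is at most n. A NumPy sketch (an illustration, not part of the notes):

```python
import numpy as np

# Four vectors in R^3, as the columns of a 3x4 matrix.
rng = np.random.default_rng(0)
vectors = rng.standard_normal((3, 4))

# rank <= 3 < 4 always holds, so any four vectors in R^3 are dependent.
print(np.linalg.matrix_rank(vectors))
```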

**Theorem 10 (Extension Theorem)** Assume that the vector space **V** is finitely generated, and that we have a linearly independent subset **S** ⊂ **V**. Then there exists a basis **B** of **V** with **S** ⊂ **B**.

*Proof*

If [**S**] = **V** then we simply take **B** = **S**. Otherwise, start with some given basis **A** ⊂ **V** and apply theorem 9 successively.
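The extension theorem can be sketched numerically as a greedy procedure: starting from a linearly independent set, add vectors from a known basis whenever they increase the rank. This NumPy illustration (not part of the original notes) extends a one-element set to a basis of ℜ³:

```python
import numpy as np

# S: a linearly independent set to be extended to a basis of R^3.
B = [np.array([1.0, 1.0, 0.0])]

# Candidates come from a known basis A (here the standard basis).
# B stays linearly independent, so rank(B) == len(B) is an invariant.
for e in np.eye(3):
    candidate = np.column_stack(B + [e])
    if np.linalg.matrix_rank(candidate) > len(B):
        B.append(e)

print(len(B))  # 3: B is a basis of R^3 containing the original set S
```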

**Theorem 11** Let **U** be a subspace of the (finitely generated) vector space **V**. Then **U** is also finitely generated, and each possible basis for **U** has no more elements than any basis for **V**.

*Proof*

Assume there is a basis **B** of **V** containing n vectors. Then, according to theorem 9, there cannot exist more than n linearly independent vectors in **U**. Therefore **U** must be finitely generated, and any basis for **U** has at most n elements.

**Theorem 12** Assume the vector space **V** has a basis consisting of n elements. Then every basis of **V** also has precisely n elements.

*Proof*

This follows directly from theorem 11, since any basis generates **V**, which is a subspace of itself.

**Definition**

The number of vectors in a basis of the vector space **V** is called the *dimension* of **V**, written dim(**V**).

**Definition**

Let **V** be a vector space with subspaces **X**, **Y** ⊂ **V**. The subspace **X** + **Y** = [**X** ∪ **Y**] is called the *sum* of **X** and **Y**. If **X** ∩ **Y** = {**0**}, then it is the *direct sum*, written **X** ⊕ **Y**.

**Theorem 13 (A Dimension Formula)** Let **V** be a finite dimensional vector space with subspaces **X**, **Y** ⊂ **V**. Then we have

dim(**X** + **Y**) = dim(**X**) + dim(**Y**) − dim(**X** ∩ **Y**).

**Corollary**

dim(**X** ⊕ **Y**) = dim(**X**) + dim(**Y**).

*Proof of Theorem 13*

Let **S** = {**v**₁, . . . , **v**ₙ} be a basis of **X** ∩ **Y**. According to theorem 10, there exist extensions **T** = {**x**₁, . . . , **x**ₘ} and **U** = {**y**₁, . . . , **y**ᵣ}, such that **S** ∪ **T** is a basis for **X** and **S** ∪ **U** is a basis for **Y**. We will now show that, in fact, **S** ∪ **T** ∪ **U** is a basis for **X** + **Y**.

To begin with, it is clear that **X** + **Y** = [**S** ∪ **T** ∪ **U**]. Is the set **S** ∪ **T** ∪ **U** linearly independent? Let

a₁**v**₁ + · · · + aₙ**v**ₙ + b₁**x**₁ + · · · + bₘ**x**ₘ + c₁**y**₁ + · · · + cᵣ**y**ᵣ = **0**,

and write **v** = a₁**v**₁ + · · · + aₙ**v**ₙ, **x** = b₁**x**₁ + · · · + bₘ**x**ₘ and **y** = c₁**y**₁ + · · · + cᵣ**y**ᵣ. Then we have **y** = −**v** − **x**. Thus **y** ∈ **X**. But clearly we also have **y** ∈ **Y**. Therefore **y** ∈ **X** ∩ **Y**. Thus **y** can be expressed as a linear combination of vectors in **S** alone, and since **S** ∪ **U** is a basis for **Y**, we must have cₖ = 0 for k = 1, . . . , r. Similarly, looking at the vector **x** and applying the same argument, we conclude that all the bⱼ are zero. But then all the aᵢ must also be zero, since the set **S** is linearly independent.

Putting this all together, we see that dim(**X**) = n + m, dim(**Y**) = n + r and dim(**X** ∩ **Y**) = n, while dim(**X** + **Y**) = n + m + r. This gives the dimension formula.
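The dimension formula can be checked on a concrete pair of subspaces of ℜ⁴. In this NumPy sketch (an illustration, not a proof, and not part of the notes), dim(**X** + **Y**) is the rank of the stacked generating sets, and the formula then recovers dim(**X** ∩ **Y**):

```python
import numpy as np

# X = span{e1, e2} and Y = span{e2, e3} in R^4, given by columns.
X = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]]).T
Y = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]]).T

dim_X = np.linalg.matrix_rank(X)                    # 2
dim_Y = np.linalg.matrix_rank(Y)                    # 2
dim_sum = np.linalg.matrix_rank(np.hstack([X, Y]))  # 3, span{e1, e2, e3}

# Rearranging the formula: dim(X ∩ Y) = dim(X) + dim(Y) - dim(X + Y) = 1,
# matching X ∩ Y = span{e2}.
dim_int = dim_X + dim_Y - dim_sum
print(dim_X, dim_Y, dim_sum, dim_int)  # 2 2 3 1
```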

**Theorem 14** Let **V** be a finite dimensional vector space, and let **X** ⊂ **V** be a subspace. Then there exists another subspace **Y** ⊂ **V** such that

**V** = **X** ⊕ **Y**.

*Proof*

Take a basis **S** of **X**. If [**S**] = **V** then we are finished. Otherwise, use the extension theorem (theorem 10) to find a basis **B** of **V** with **S** ⊂ **B**. Then **Y** = [**B** \ **S**] satisfies the condition of the theorem. (The notation **B** \ **S** denotes the set of elements of **B** which are not in **S**.)
