Thursday, June 12, 2014

Linear Algebra: #11 Eigenvalues, Eigenspaces, Matrices which can be Diagonalized

Definition
Let f : V → V be a linear mapping of an n-dimensional vector space into itself. A subspace U ⊆ V is called invariant with respect to f if f(U) ⊂ U. That is, f(u) ∈ U for all u ∈ U.


Theorem 28
Assume that the r-dimensional subspace U ⊆ V is invariant with respect to f : V → V. Let A be the matrix representing f with respect to a given basis {v1, . . . , vn} of V. Then A is similar to a matrix A' which has the following form:

$$A' = \begin{pmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{pmatrix}$$

where A11 is an r × r block, A12 is r × (n − r), A22 is (n − r) × (n − r), and the lower-left (n − r) × r block is zero.

Proof
Let {u1, . . . , ur} be a basis for the subspace U, and extend it to a basis {u1, . . . , ur, ur+1, . . . , un} of V. For each j ≤ r we have f(uj) ∈ U, so f(uj) is a linear combination of u1, . . . , ur alone; hence the first r columns of the matrix of f with respect to this new basis have zeros below the r-th row, which is exactly the desired form.
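As a numerical sketch of Theorem 28 (the matrix and subspace below are assumed examples, not from the text): since an eigenvector spans a one-dimensional invariant subspace, extending it to a basis and changing basis must produce the block upper-triangular form.

```python
import numpy as np

# Assumed example: u is an eigenvector of A (A @ u = 3u), so U = span{u}
# is invariant under A.  Extend {u} to the basis {u, e2} of R^2 and
# compute the matrix of the map in that basis.
A = np.array([[2., 1.],
              [0., 3.]])
u = np.array([1., 1.])                    # A @ u = (3, 3) = 3u
P = np.column_stack([u, [0., 1.]])        # columns: the new basis {u, e2}

A_prime = np.linalg.inv(P) @ A @ P        # matrix of f in the new basis
print(A_prime)                            # lower-left block is 0, as predicted
```

Here the invariant subspace has r = 1, so A11 is the 1 × 1 block (3) and the single entry below it vanishes.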

Definition
Let U1, . . . , Up ⊆ V be subspaces. We say that V is the direct sum of these subspaces if V = U1 + · · · + Up, and furthermore, if v = u1 + · · · + up with ui ∈ Ui for each i, then this expression for v is unique. In other words, if v = u1 + · · · + up = u1' + · · · + up' with ui, ui' ∈ Ui for each i, then ui = ui' for each i. In this case, one writes V = U1 ⊕ · · · ⊕ Up.

This immediately gives the following result:


Theorem 29
Let f : V → V be such that there exist subspaces Ui ⊆ V, for i = 1, . . . , p, with V = U1 ⊕ · · · ⊕ Up, where each Ui is invariant with respect to f. Then there exists a basis of V such that the matrix of f with respect to this basis has the following block form,

$$\begin{pmatrix} A_1 & & 0 \\ & \ddots & \\ 0 & & A_p \end{pmatrix}$$

where each block Ai is a square matrix, representing the restriction of f to the subspace Ui.

Proof
Choose the basis to be a union of bases for each of the Ui.
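Theorem 29 can be checked numerically as follows (the blocks and the basis matrix below are assumed for illustration): build a map whose matrix is block diagonal in a chosen basis {p1, p2, p3}, express it in the standard basis, and verify that changing back to the union of bases recovers the block form.

```python
import numpy as np

# Assumed example: U1 = span{p1, p2} and U2 = span{p3} are invariant,
# with f acting as A1 (2x2) on U1 and A2 (1x1) on U2.
blocks = np.array([[2., 1., 0.],
                   [0., 2., 0.],
                   [0., 0., 5.]])         # block diag(A1, A2)
P = np.array([[1., 0., 1.],
              [1., 1., 0.],
              [0., 1., 1.]])              # columns p1, p2, p3 (invertible)
A = P @ blocks @ np.linalg.inv(P)         # the same map in the standard basis

recovered = np.linalg.inv(P) @ A @ P      # change to the basis {p1, p2, p3}
print(np.round(recovered, 10))            # off-diagonal blocks are zero
```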

A special case is when the invariant subspace is an eigenspace.

Definition
Assume that λ ∈ F is an eigenvalue of the mapping f : V → V. The set {v ∈ V : f(v) = λv} is called the eigenspace of λ with respect to the mapping f. That is, the eigenspace consists of all eigenvectors with eigenvalue λ, together with the zero vector 0.


Theorem 30
Each eigenspace is a subspace of V.

Proof
Let u, w ∈ V be in the eigenspace of λ, and let a, b ∈ F be arbitrary scalars. Then we have

f(au + bw) = af(u) + bf(w) = aλu + bλw = λ(au + bw),

so au + bw again lies in the eigenspace of λ.
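Numerically, the eigenspace of λ is the null space of A − λI, which can be read off from the SVD: the rows of Vt whose singular values vanish span the null space. A small sketch (the example matrix is assumed, not from the text):

```python
import numpy as np

# Assumed example: λ = 3 is an eigenvalue of A; its eigenspace is the
# null space of M = A - 3I.
A = np.array([[3., 1.],
              [0., 3.]])
lam = 3.0
M = A - lam * np.eye(2)
_, s, Vt = np.linalg.svd(M)
rank = int(np.sum(s > 1e-10))             # numerical rank of M
basis = Vt[rank:]                         # rows spanning the eigenspace
print(basis.shape[0])                     # dimension of the eigenspace
for v in basis:
    # each basis vector really satisfies f(v) = λv
    assert np.allclose(A @ v, lam * v)
```

In this assumed example the eigenspace is one-dimensional, even though λ = 3 is a repeated root of the characteristic polynomial, so this particular A is not diagonalizable.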

Obviously if λ1 and λ2 are two distinct eigenvalues (λ1 ≠ λ2), then the only common element of their eigenspaces is the zero vector 0. Thus if every vector in V is an eigenvector, then we have the situation of Theorem 29. One very particular case is that of n different eigenvalues, where n is the dimension of V.


Theorem 31
Let λ1, . . . , λn be eigenvalues of the linear mapping f : V → V, where λi ≠ λj for i ≠ j. Let v1, . . . , vn be eigenvectors to these eigenvalues. That is, vi ≠ 0 and f(vi) = λivi, for each i = 1, . . . , n. Then the set {v1, . . . , vn} is linearly independent.

Proof
Assume to the contrary that there exist a1, . . . , an, not all zero, with
a1v1 + · · · + anvn = 0

Assume further that as few of the ai as possible are non-zero. Let ap be the first non-zero scalar; that is, ai = 0 for i < p, and ap ≠ 0. Obviously some other ak is non-zero, for some k ≠ p, for otherwise we would have the equation 0 = apvp, which would imply that vp = 0, contrary to the assumption that vp is an eigenvector. Therefore we have

$$0 = f(\mathbf{0}) - \lambda_p \mathbf{0} = f\Big(\sum_{i=p}^{n} a_i v_i\Big) - \lambda_p \sum_{i=p}^{n} a_i v_i = \sum_{i=p}^{n} a_i (\lambda_i - \lambda_p)\, v_i.$$

But, remembering that λi ≠ λj for i ≠ j, we see that the scalar term for vp is zero (since λp − λp = 0), yet all other non-zero scalar terms remain non-zero. Thus we have found a new sum with fewer non-zero scalars than in the original sum with the ai. This is a contradiction.

Therefore, in this particular case, the given set of eigenvectors {v1, . . . , vn} forms a basis for V. With respect to this basis, the matrix of the mapping is diagonal, with the diagonal elements being the eigenvalues:

$$\begin{pmatrix} \lambda_1 & & 0 \\ & \ddots & \\ 0 & & \lambda_n \end{pmatrix}$$
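This final claim can be illustrated numerically (the matrix below is an assumed example): a matrix with n distinct eigenvalues is diagonalized by any basis of corresponding eigenvectors.

```python
import numpy as np

# Assumed example: A has distinct eigenvalues 1 and 3, so its
# eigenvectors form a basis of R^2 by Theorem 31.
A = np.array([[2., 1.],
              [1., 2.]])
eigvals, V = np.linalg.eig(A)             # columns of V are eigenvectors
D = np.linalg.inv(V) @ A @ V              # matrix of the map in the eigenbasis
print(np.round(D, 10))                    # diagonal, entries = eigenvalues
```

The invertibility of V is exactly the linear independence guaranteed by Theorem 31; with a repeated eigenvalue, V may fail to be invertible and this construction breaks down.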
