Friday, June 20, 2014

Linear Algebra: #18 Orthogonal Bases

Our vector space V is now assumed to be either Euclidean, or else unitary — that is, it is defined over either the real numbers ℝ, or else the complex numbers ℂ. In either case we have a scalar product <·,·> : V × V → F (here, F = ℝ or ℂ).

As always, we assume that V is finite dimensional, and thus it has a basis {v1, . . . , vn}. Thinking about the canonical basis for ℝⁿ or ℂⁿ, and the standard inner product as our scalar product, we see that it would be nice if we had
  • <vj , vj> = 1, for all j (that is, the basis vectors are normalized), and furthermore 
  • <vj , vk> = 0, for all j ≠ k (that is, the basis vectors are an orthogonal set in V). 
That is, in a single formula,

<vj , vk> = δjk,

where δjk is the Kronecker delta (equal to 1 when j = k and 0 when j ≠ k).
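As a concrete sanity check of these two conditions, here is a small numerical sketch (using numpy for illustration; the variable names are my own) verifying that the canonical basis of ℝ³ satisfies <vj, vk> = δjk under the standard dot product:

```python
import numpy as np

# Rows of the identity matrix are the canonical basis e1, e2, e3 of R^3
basis = np.eye(3)

# Gram matrix: entry (j, k) is the scalar product <e_j, e_k>
gram = np.array([[np.dot(basis[j], basis[k]) for k in range(3)]
                 for j in range(3)])

# Orthonormality means the Gram matrix is the identity (delta_jk)
assert np.allclose(gram, np.eye(3))
```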

That is to say, {v1, . . . , vn} is an orthonormal basis of V. Unfortunately, most bases are not orthonormal. But this doesn’t really matter. For, starting from any given basis, we can successively alter the vectors in it, gradually changing it into an orthonormal basis. This process is often called the Gram-Schmidt orthonormalization process. But first, to show you why orthonormal bases are good, we have the following theorem.

Theorem 48
Let V have the orthonormal basis {v1, . . . , vn}, and let x ∈ V be arbitrary. Then

x = <x, v1>v1 + <x, v2>v2 + · · · + <x, vn>vn.

That is, the coefficients of x, with respect to the orthonormal basis, are simply the scalar products with the respective basis vectors.

This follows simply because if x = ∑ aj vj (summing over j = 1, . . . , n), then for each k we have

<x, vk> = <∑ aj vj , vk> = ∑ aj <vj , vk> = ak,

since <vj , vk> = 0 for j ≠ k and <vk , vk> = 1.
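Theorem 48 is easy to check numerically. The following sketch (numpy-based, with illustrative names) builds an orthonormal basis of ℝ², forms x with known coefficients, and recovers those coefficients as scalar products:

```python
import numpy as np

# An orthonormal basis of R^2: the canonical basis rotated by 0.3 radians
v1 = np.array([np.cos(0.3), np.sin(0.3)])
v2 = np.array([-np.sin(0.3), np.cos(0.3)])

# Build x with known coefficients a1 = 2, a2 = -5
x = 2.0 * v1 - 5.0 * v2

# By Theorem 48, the coefficients are just the scalar products <x, vk>
a1 = np.dot(x, v1)
a2 = np.dot(x, v2)

assert abs(a1 - 2.0) < 1e-12
assert abs(a2 + 5.0) < 1e-12
```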

So now to the Gram-Schmidt process. To begin with, if a non-zero vector v ∈ V is not normalized — that is, its norm is not one — then it is easy to multiply it by a scalar, changing it into a vector with norm one. For we have <v, v> > 0. Therefore ||v|| = √<v, v> > 0 and we have

<v/||v|| , v/||v||> = (1/||v||²) <v, v> = 1,  so that  || v/||v|| || = 1.

In other words, we simply multiply the vector by the inverse of its norm.
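A minimal numerical sketch of this normalization step (numpy used for illustration):

```python
import numpy as np

v = np.array([3.0, 4.0])          # a non-zero vector with norm 5
u = v / np.linalg.norm(v)         # multiply by the inverse of the norm

# The result is normalized: ||u|| = 1
assert abs(np.linalg.norm(u) - 1.0) < 1e-12
```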

Theorem 49
Every finite dimensional vector space V which has a scalar product has an orthonormal basis.

The proof proceeds by constructing an orthonormal basis {u1, . . . , un} from a given, arbitrary basis {v1, . . . , vn}. To describe the construction, we use induction on the dimension, n. If n = 1 then there is almost nothing to prove. Any non-zero vector is a basis for V, and as we have seen, it can be normalized by dividing by the norm. (That is, scalar multiplication with the inverse of the norm.)

So now assume that n ≥ 2, and furthermore assume that the Gram-Schmidt process can be carried out in any n−1 dimensional space. Let U ⊆ V be the subspace spanned by the first n − 1 basis vectors {v1, . . . , vn−1}. Since U is only n − 1 dimensional, our assumption is that there exists an orthonormal basis {u1, . . . , un−1} for U. Clearly, adding in vn gives a new basis {u1, . . . , un−1, vn} for V.

[Since both {v1, . . . , vn−1} and {u1, . . . , un−1} are bases for U, we can write each vj as a linear combination of the uk’s. Therefore {u1, . . . , un−1, vn} spans V, and since the dimension is n, it must be a basis.]

Unfortunately, this last vector, vn, might disturb the orthonormal character of the other vectors. Therefore, we replace vn with a new vector, obtained by subtracting off its components along u1, . . . , un−1. (A linearly independent set remains linearly independent if a linear combination of the other vectors is added to one of its members.)

wn = vn − <vn, u1>u1 − <vn, u2>u2 − · · · − <vn, un−1>un−1.

Note that wn ≠ 0, since vn does not lie in U. For each k ≤ n − 1 we then have

<wn, uk> = <vn, uk> − <vn, uk> = 0,

so wn is orthogonal to each of u1, . . . , un−1. Setting un = wn/||wn|| gives the orthonormal basis {u1, . . . , un} of V, completing the induction.
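The whole construction can be sketched in a few lines of code. The following is an illustrative implementation for real vectors under the standard dot product (the complex case would additionally need conjugation in the scalar product); the function name and test vectors are my own:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent real vectors."""
    ortho = []
    for v in vectors:
        # Subtract the components of v along the vectors found so far
        w = v - sum(np.dot(v, u) * u for u in ortho)
        # Normalize: multiply by the inverse of the norm
        ortho.append(w / np.linalg.norm(w))
    return ortho

# A (non-orthonormal) basis of R^3
u = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0]),
                  np.array([0.0, 1.0, 1.0])])

# The Gram matrix of the result should be the identity
G = np.array([[np.dot(a, b) for b in u] for a in u])
assert np.allclose(G, np.eye(3))
```

Each pass of the loop is exactly the inductive step above: project away the part of the new vector lying in the span of the earlier ones, then normalize what remains.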
