
Sunday, June 22, 2014

Linear Algebra: #20 Characterizing Orthogonal, Unitary, and Hermitian Matrices


20.1 Orthogonal matrices
Let V be an n-dimensional real vector space (that is, over the real numbers ℝ), and let {v1, . . . , vn} be an orthonormal basis for V. Let f : V → V be an orthogonal mapping, and let A be its matrix with respect to the basis {v1, . . . , vn}. Then we say that A is an orthogonal matrix.


Theorem 50
The n × n matrix A is orthogonal ⇔ A−1 = At. (Recall that if aij is the ij-th element of A, then the ij-th element of At is aji; that is, everything is “flipped over” the main diagonal of A.)

Proof
For an orthogonal mapping f, we have <u, w> = <f(u), f(w)> for all vectors u and w; in particular, <vj, vk> = <f(vj), f(vk)> for all j and k. But in matrix notation, the scalar product becomes the inner product of coordinate columns. That is, if

we identify each basis vector vj with its coordinate column ej = (0, . . . , 0, 1, 0, . . . , 0)t (with the 1 in the j-th place), then <vj, vk> = ejt ek, which is 1 if j = k and 0 otherwise.

In other words, the matrix whose jk-th element is always <vj , vk> is the n×n identity matrix In. On the other hand,

f(vj) = a1j v1 + a2j v2 + · · · + anj vn, which has the coordinate column (a1j, a2j, . . . , anj)t.

That is, we obtain the j-th column of the matrix A. Furthermore, since <vj, vk> = <f(vj), f(vk)>, the matrix whose jk-th element is <f(vj), f(vk)> must again be the identity matrix. So

<f(vj), f(vk)> = a1j a1k + a2j a2k + · · · + anj ank, which must be 1 if j = k and 0 otherwise.

But this sum is exactly the jk-th element of the matrix product AtA. All together, we have

AtA = In.

Thus we conclude that A−1 = At. (Note: this was only the proof that f orthogonal ⇒ A−1 = At. The proof in the other direction, going backwards through our argument, is easy, and is left as an exercise for you.)
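
Theorem 50 is easy to check numerically. Here is a minimal sketch (not part of the original notes; it assumes Python with NumPy is available) that builds a 2 × 2 rotation matrix, which is orthogonal, and verifies both that AtA = In and that the mapping preserves the scalar product.

```python
import numpy as np

# A rotation matrix is the standard example of an orthogonal matrix.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Check A^t A = I_2, i.e. A^{-1} = A^t (Theorem 50).
print(np.allclose(A.T @ A, np.eye(2)))          # True
print(np.allclose(np.linalg.inv(A), A.T))       # True

# Check that the mapping preserves the scalar product: <u, w> = <Au, Aw>.
u = np.array([1.0, 2.0])
w = np.array([-3.0, 0.5])
print(np.isclose(u @ w, (A @ u) @ (A @ w)))     # True
```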


20.2 Unitary matrices


Theorem 51
The n × n matrix A is unitary ⇔ A−1 = Āt. (Here Ā is the matrix obtained from A by taking the complex conjugate of each of its elements.)

Proof
Entirely analogous to the case of orthogonal matrices. One must note, however, that the inner product in the complex case is

<u, w> = ū1w1 + ū2w2 + · · · + ūnwn, where u = u1v1 + · · · + unvn and w = w1v1 + · · · + wnvn.
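
The same check works in the complex case. The following is a minimal sketch (again assuming NumPy; the particular matrix is just an illustrative example) using a simple 2 × 2 unitary matrix. Note that np.vdot conjugates its first argument, matching the Hermitian inner product written above.

```python
import numpy as np

# A simple 2x2 unitary matrix.
A = (1 / np.sqrt(2)) * np.array([[1, 1j],
                                 [1j, 1]])

# Check that the inverse equals the conjugate transpose (Theorem 51).
print(np.allclose(np.linalg.inv(A), A.conj().T))         # True
print(np.allclose(A.conj().T @ A, np.eye(2)))            # True

# Check that the Hermitian inner product is preserved: <u, w> = <Au, Aw>.
u = np.array([1.0 + 2.0j, -1.0j])
w = np.array([0.5, 2.0 - 1.0j])
print(np.isclose(np.vdot(u, w), np.vdot(A @ u, A @ w)))  # True
```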

20.3 Hermitian and symmetric matrices


Theorem 52
The n × n matrix A is Hermitian (that is, self-adjoint) ⇔ A = Āt.

Proof
This is again a matter of translating the condition <vj, f(vk)> = <f(vj), vk> into matrix notation, where f is the linear mapping which is represented by the matrix A, with respect to the orthonormal basis {v1, . . . , vn}. We have

<vj, f(vk)> = ajk and <f(vj), vk> = ākj, so that ajk = ākj for all j and k; in other words, A = Āt.

In particular, we see that in the real case, self-adjoint matrices are symmetric.
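
Finally, a small sketch in the same spirit (again assuming NumPy; the matrices are arbitrary illustrative examples): a Hermitian matrix satisfies A = Āt, the self-adjoint condition holds on the basis vectors, and in the real case the condition reduces to symmetry.

```python
import numpy as np

# A Hermitian matrix: real diagonal, conjugate-symmetric off-diagonal entries.
A = np.array([[2.0,      1.0 - 1j],
              [1.0 + 1j, 3.0     ]])
print(np.allclose(A, A.conj().T))        # True: A equals its conjugate transpose

# The self-adjoint condition <v_j, A v_k> = <A v_j, v_k> on basis vectors.
e = np.eye(2)
print(np.isclose(np.vdot(e[0], A @ e[1]), np.vdot(A @ e[0], e[1])))  # True

# In the real case the condition reduces to symmetry: A = A^t.
B = np.array([[1.0, 4.0],
              [4.0, 2.0]])
print(np.allclose(B, B.T))               # True
```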
