Tuesday, October 18, 2011

Are the Eigenvectors of a Matrix Always Orthogonal? (With an Octave/Matlab Check)

In Linear Algebra, an orthogonal matrix is defined as a square matrix whose transpose equals its inverse,
$$ Q^T = Q^{-1}$$ or
$$ QQ^T = Q^TQ = I $$ What about the eigenvectors of a matrix? Are they always orthogonal? The short answer is no. The eigenvectors of an arbitrary square real matrix A are guaranteed to be linearly independent if there are no repeated eigenvalues, but they are not necessarily orthogonal.
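As a quick sanity check (a small example of my own, not from the discussion above), take a non-symmetric 2x2 matrix. Its eigenvectors are independent but clearly not orthogonal:
>> A = [2 1; 0 3];
>> [v,e] = eig(A);
>> v(:,1)'*v(:,2)
The eigenvalues here are 2 and 3 (distinct, so the eigenvectors are independent), yet the dot product of the two normalized eigenvectors comes out around 0.7071 in magnitude, far from zero.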

Are the eigenvectors of a symmetric matrix always orthogonal? The short answer is yes. Gilbert Strang gives the following characterization: a real matrix has perpendicular eigenvectors if and only if $A^{T}A=AA^{T}$, i.e., if and only if the matrix is normal.

It follows that the eigenvectors of a symmetric real matrix A (i.e., $A = A^T$) are perpendicular, since a symmetric matrix trivially satisfies $A^{T}A = AA^{T}$. Another nice property of symmetric matrices is that their eigenvalues are real.
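To see both properties on a tiny concrete case (again my own illustration, not part of the original recipe), try a symmetric 2x2 matrix:
>> A = [2 1; 1 2];
>> [v,e] = eig(A);
>> v'*v
The product v'*v comes out as the 2x2 identity matrix (up to rounding), and diag(e) gives the real eigenvalues 1 and 3.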

Let's check using Octave/Matlab.
First, let us create a 10x10 matrix with random complex entries. The randn() function returns random numbers drawn from the normal distribution with mean zero and standard deviation one. So in Octave/Matlab we do,
>> M = randn(10,10)+i*randn(10,10);
Check the orthogonality of the eigenvectors of M:
>> [v,e] = eig(M);
Continue with,
>> v'*v
You will see that the result is not the identity matrix.
The 10x10 matrix of random complex numbers above is *NOT* Hermitian. To make it so, we can symmetrize it by averaging it with its Hermitian conjugate M':
>> H = (M+M')/2;
H is Hermitian. We diagonalize it to get the eigenvectors v and eigenvalues e with the command
>> [v,e] = eig(H);
Both v and e are 10x10 matrices. v has the eigenvectors in the columns, and e has the eigenvalues on the diagonal of the matrix. You can check that e is diagonal with real entries. You can extract the diagonal with diag(e).
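For example, you can verify numerically that e is diagonal with real entries (these two checks are my own additions):
>> norm(e - diag(diag(e)))
>> max(abs(imag(diag(e))))
Both quantities should be zero, or at worst negligibly small.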

Let us check that the eigenvectors of the Hermitian matrix are orthonormal. For example, let us check that the first and second vectors are orthogonal:
>> v(:,1)'*v(:,2) 
>> ans = 1.0786e-16 - 8.8048e-17i
The way to read this is that we are taking the complex scalar product of the first and second columns of the matrix v. The prime operator ' conjugate-transposes the column vector v(:,1) into a row vector, which, when multiplied by v(:,2), gives a scalar that is very close to zero. We can check the orthogonality of all the vectors at once by computing all mutual dot products with the condensed formula
>> v'*v
which returns a 10x10 matrix of dot products that should be equal to, or very close to, the identity matrix.
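A compact way to quantify how close v'*v is to the identity (my own one-liner, assuming the 10x10 Hermitian case above) is:
>> norm(v'*v - eye(10))
For a Hermitian H this should be on the order of machine precision, i.e. roughly 1e-15.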

