Matrices of the same size can be added and subtracted entrywise, and matrices of compatible sizes can be multiplied. These operations have many of the properties of ordinary arithmetic, except that matrix multiplication is not commutative; that is, AB and BA are not equal in general. Matrices consisting of only one column or row define the components of vectors, while higher-dimensional (e.g., three-dimensional) arrays of numbers define the components of a generalization of a vector called a tensor. Matrices with entries in other fields or rings are also studied.
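As a small illustration of non-commutativity (a sketch assuming NumPy, where `@` is matrix multiplication):

```python
import numpy as np

# Two 2x2 matrices: matrix multiplication is generally not commutative.
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

AB = A @ B  # [[2, 1], [4, 3]]
BA = B @ A  # [[3, 4], [1, 2]]

print(np.array_equal(AB, BA))  # False
```

Here B swaps columns when multiplied on the right and swaps rows when multiplied on the left, so the two products differ.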
Matrices are a key tool in linear algebra. One use of matrices is to represent linear transformations, which are higher-dimensional analogs of linear functions of the form f(x) = cx, where c is a constant; matrix multiplication corresponds to composition of linear transformations. Matrices can also keep track of the coefficients in a system of linear equations. For a square matrix, the determinant and inverse matrix (when it exists) govern the behavior of solutions to the corresponding system of linear equations, and eigenvalues and eigenvectors provide insight into the geometry of the associated linear transformation.
Eigenvalues are a special set of scalars associated with a linear system of equations (i.e., a matrix equation) that are sometimes also known as characteristic roots, characteristic values, proper values, or latent roots.
The determination of the eigenvalues and eigenvectors of a system is extremely important in physics and engineering, where it is equivalent to matrix diagonalization and arises in such common applications as stability analysis, the physics of rotating bodies, and small oscillations of vibrating systems, to name only a few. Each eigenvalue is paired with a corresponding so-called eigenvector (or, in general, a corresponding right eigenvector and a corresponding left eigenvector; there is no analogous distinction between left and right for eigenvalues).
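A minimal numerical example (assuming NumPy; `numpy.linalg.eig` returns the eigenvalues and a matrix whose columns are the eigenvectors), checking that each pair satisfies the eigenvalue equation Ax = ax:

```python
import numpy as np

# Eigen-decomposition of a small symmetric matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each eigenpair satisfies A @ v = lam * v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(sorted(eigenvalues))  # the eigenvalues of [[2,1],[1,2]] are 1 and 3
```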
A Hermitian matrix (or self-adjoint matrix) is a square matrix with complex entries which is equal to its own conjugate transpose; that is, the element in the i-th row and j-th column is equal to the complex conjugate of the element in the j-th row and i-th column, for all indices i and j:

a_ij = (a_ji)*

If the conjugate transpose of a matrix A is denoted by A*, then the Hermitian property can be written concisely as

A = A*
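This condition is easy to check numerically (a sketch assuming NumPy, where `M.conj().T` is the conjugate transpose of `M`):

```python
import numpy as np

def is_hermitian(M):
    """Check M == M* (its conjugate transpose) entrywise."""
    return bool(np.allclose(M, M.conj().T))

# Real diagonal, off-diagonal entries that are conjugates of each other.
H = np.array([[2.0,    1 - 1j],
              [1 + 1j, 3.0   ]])

print(is_hermitian(H))                            # True
print(is_hermitian(np.array([[0, 1], [2, 0]])))   # False
```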
Properties of Hermitian matrices
For two matrices A, B in M_n we have:
If A is Hermitian, then the main diagonal entries of A are all real. In order to specify the n^2 entries of A, one may specify freely any n real numbers for the main diagonal entries and any n(n-1)/2 complex numbers for the off-diagonal entries;
A + A*, AA* and A*A are all Hermitian for all A in M_n;
If A is Hermitian, then A^k is Hermitian for all k = 1, 2, 3, .... If A is nonsingular as well, then A^(-1) is Hermitian;
If A, B are Hermitian, then aA + bB is Hermitian for all real scalars a, b;
A - A* is skew-Hermitian for all A in M_n;
If A, B are skew-Hermitian, then aA + bB is skew-Hermitian for all real scalars a, b;
If A is Hermitian, then iA is skew-Hermitian;
If A is skew-Hermitian, then iA is Hermitian;
Any A in M_n can be written as
A = H(A) + S(A), where H(A) = (A + A*)/2 and S(A) = (A - A*)/2
are respectively the Hermitian and skew-Hermitian parts of A.
Theorem: Each A in M_n can be written uniquely as A = S + iT, where S and T are both Hermitian. It can also be written uniquely as A = B + C, where B is Hermitian and C is skew-Hermitian.
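The Hermitian/skew-Hermitian decomposition can be sketched as follows (assuming NumPy; the random matrix is just a generic complex example):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# Hermitian and skew-Hermitian parts of A.
H = (A + A.conj().T) / 2   # Hermitian:       H = H*
S = (A - A.conj().T) / 2   # skew-Hermitian:  S = -S*

assert np.allclose(H, H.conj().T)    # H is Hermitian
assert np.allclose(S, -S.conj().T)   # S is skew-Hermitian
assert np.allclose(A, H + S)         # the two parts recover A
```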
Theorem: Let A in M_n be Hermitian. Then
x*Ax is real for all x in C^n;
All the eigenvalues of A are real; and
S*AS is Hermitian for all S in M_n.
Theorem: Let A in M_n be given. Then A is Hermitian if and only if at least one of the following holds:
x*Ax is real for all x in C^n;
A is normal and all the eigenvalues of A are real; or
S*AS is Hermitian for all S in M_n.
Theorem [the spectral theorem for Hermitian matrices]: Let A in M_n be given. Then A is Hermitian if and only if there are a unitary matrix U and a real diagonal matrix D such that A = UDU*. Moreover, A is real and Hermitian (i.e. real symmetric) if and only if there exist a real orthogonal matrix Q and a real diagonal matrix D such that A = QDQ^T.
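A sketch of this factorization in code (assuming NumPy; `numpy.linalg.eigh` is the solver specialized for Hermitian matrices and returns real eigenvalues and a unitary eigenvector matrix):

```python
import numpy as np

A = np.array([[2.0,    1 - 1j],
              [1 + 1j, 3.0   ]])

w, U = np.linalg.eigh(A)   # w: real eigenvalues, U: unitary

# U is unitary: U* U = I.
assert np.allclose(U.conj().T @ U, np.eye(2))
# The spectral theorem: A = U D U* with D = diag(w) real.
assert np.allclose(A, U @ np.diag(w) @ U.conj().T)

print(w)  # [1. 4.] -- both eigenvalues are real
```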
Theorem: Let {A_i} be a given family of Hermitian matrices. Then there exists a unitary matrix U such that U A_i U* is diagonal for every member A_i of the family if and only if A_i A_j = A_j A_i for all members A_i, A_j.
Positivity of Hermitian matrices
Definition: An n x n Hermitian matrix A is said to be positive definite if
x*Ax > 0 for all nonzero x in C^n.
If x*Ax >= 0 for all x in C^n, then A is said to be positive semidefinite.
The following two theorems give useful and simple characterizations of the positivity of Hermitian matrices.
Theorem: A Hermitian matrix A is positive semidefinite if and only if all of its eigenvalues are nonnegative. It is positive definite if and only if all of its eigenvalues are positive.
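This eigenvalue test translates directly into code (a sketch assuming NumPy; `numpy.linalg.eigvalsh` returns the real eigenvalues of a Hermitian matrix):

```python
import numpy as np

def is_positive_definite(H):
    """A Hermitian H is positive definite iff all eigenvalues are > 0."""
    return bool(np.all(np.linalg.eigvalsh(H) > 0))

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvalues 1 and 3 -> positive definite
B = np.array([[1.0, 2.0],
              [2.0, 1.0]])   # eigenvalues -1 and 3 -> indefinite

print(is_positive_definite(A))  # True
print(is_positive_definite(B))  # False
```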
In the following we denote by A_k the leading principal submatrix of A determined by the first k rows and columns.
As for any positive definite matrix, if A is positive definite, then all principal minors of A are positive; when A is Hermitian, the converse is also valid. However, an even stronger statement can be made.
Theorem: If A is Hermitian, then A is positive definite if and only if det A_k > 0 for k = 1, 2, ..., n. More generally, the positivity of any nested sequence of n principal minors of A is a necessary and sufficient condition for A to be positive definite.
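The leading-principal-minor test (Sylvester's criterion) can be sketched as follows, assuming NumPy; the tridiagonal matrix below is an illustrative example:

```python
import numpy as np

def leading_minors(H):
    """Determinants of the leading principal submatrices H_1, ..., H_n."""
    n = H.shape[0]
    return [float(np.linalg.det(H[:k, :k])) for k in range(1, n + 1)]

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

minors = leading_minors(A)
print(minors)                       # [2.0, 3.0, 4.0]
print(all(m > 0 for m in minors))   # True: A is positive definite
```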
Eigenvalues of a Hermitian matrix are always real
Let's take a real symmetric matrix A. The eigenvalue equation is:
Ax = ax
where the eigenvalue a is a root of the characteristic polynomial
p(a) = det(A - aI)
and x is just the corresponding eigenvector of a. The important part
is that x is not 0 (the zero vector).
Well, anyway. Let's calculate the following inner product
(here, x_i* is the complex conjugate of x_i):
<x,Ax> = sum_i x_i* (Ax)_i
= sum_i x_i* (sum_j A_ij x_j)
= sum_i sum_j x_i* A_ij x_j
That's the inner product expanded out, which we'll use later.
But for now, note that since x is an eigenvector, we know that
Ax = ax. We can use this fact to conclude:
<x,Ax> = <x,ax>
= sum_i x_i* (ax)_i
= sum_i x_i* a x_i
= a sum_i x_i* x_i
= a (sum_i |x_i|^2)
Note that sum_i |x_i|^2 is always positive since x is nonzero. We'll
use this fact later, too. Next, we should find the following inner
product (again, y* means complex conjugate of y):
<Ax,x> = sum_i (Ax)_i* x_i
= sum_i (sum_j A_ij x_j)* x_i
= sum_i (sum_j A_ij* x_j*) x_i
= sum_i sum_j x_i A_ij* x_j*
But now, we can use the fact that A^t = A and that A is real. In
particular, that A_ij* = A_ij, and A_ji = A_ij.
<Ax,x> = sum_i sum_j x_i A_ij x_j*
= sum_i sum_j x_i A_ji x_j*
= sum_j sum_i x_j* A_ji x_i
= sum_I sum_J x_I* A_IJ x_J (renaming j->I, i->J)
= sum_i sum_j x_i* A_ij x_j (dummy variables J->j, I->i)
So, because A is real and symmetric, we have A = A^t and
<Ax,x> = <x,Ax>.
Now, take the eigenvalue equation again:
Ax = ax
Now, take the transpose and then complex conjugate:
(Ax)^t = (ax)^t
x^t A^t = a x^t
x^t A = a x^t (since A^t = A)
(x^t A)* = (a x^t)*
(x*)^t A* = a* (x*)^t
(x*)^t A = a* (x*)^t (since A* = A)
Now, just multiply both sides by x, (on the right),
(x*)^t A x = a* (x*)^t x
sum_i (x*)_i (Ax)_i = a* sum_i (x*)_i x_i
sum_i x_i* (sum_j A_ij x_j) = a* sum_i x_i* x_i
sum_i sum_j x_i* A_ij x_j = a* (sum_i |x_i|^2)
<Ax,x> = a* (sum_i |x_i|^2)
But, we already found that <x,Ax> = a (sum_i |x_i|^2),
and that <Ax,x> = <x,Ax>. Therefore,
0 = <Ax,x> - <x,Ax>
= a* (sum_i |x_i|^2) - a (sum_i |x_i|^2)
0 = (a* - a) (sum_i |x_i|^2)
Since sum_i |x_i|^2 > 0, we can divide this last equation by it,
which gives us
0 = a* - a
a = a*
Since a is any eigenvalue of A, we have proven that the complex
conjugate of a is a itself. This can only happen if a is real,
which concludes the proof.
Note that we spent most of the time doing inner product math in the
long-winded explanation given above. All we really wanted to say was
that <x,Ax> = <A'x,x>, where A' is the adjoint matrix to A (adjoint
for matrices means transpose and complex conjugation).
A matrix which is its own adjoint, i.e. A = A', is called self-adjoint
or Hermitian. That's all it means. Clearly, a real Hermitian matrix
is just a symmetric matrix.
Now, the short proof.
Consider the inner product
<u,v> = sum_i u_i* v_i
and let A be a Hermitian matrix. Let x be an eigenvector of A
with eigenvalue a. Then,
<x,Ax> = <x,ax> = a <x,x>
<Ax,x> = <ax,x> = a* <x,x>
Lastly, note that
<x,Ax> = <A'x,x> (adjoint matrix)
= <Ax,x> (since A is self-adjoint)
0 = <Ax,x> - <x,Ax>
= a* <x,x> - a <x,x>
0 = (a* - a) <x,x>
0 = a* - a (we can divide by <x,x> since it's nonzero)
a = a*
Therefore, any eigenvalue a of a Hermitian matrix A is real.
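The theorem can also be checked numerically (a sketch assuming NumPy): build a random Hermitian matrix, feed it to the generic eigensolver `numpy.linalg.eig` (which does not assume any symmetry), and confirm the imaginary parts of the eigenvalues vanish up to floating-point error.

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
H = (X + X.conj().T) / 2          # force H to be Hermitian

eigenvalues = np.linalg.eig(H)[0]  # generic solver, no symmetry assumed
print(np.max(np.abs(eigenvalues.imag)))  # ~0, only floating-point noise
```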
Similarly, we can prove that det(H - 3iI) cannot be zero, where H is a Hermitian matrix and I is the identity matrix: since every eigenvalue of H is real, the purely imaginary number 3i is not an eigenvalue of H, and so H - 3iI is nonsingular.
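A quick numerical check of this claim (assuming NumPy, with a small Hermitian matrix chosen for illustration):

```python
import numpy as np

# Since every eigenvalue of a Hermitian H is real, 3i is never an
# eigenvalue, so det(H - 3i*I) is nonzero.
H = np.array([[2.0,    1 - 1j],
              [1 + 1j, 3.0   ]])
I = np.eye(2)

d = np.linalg.det(H - 3j * I)
print(abs(d) > 1e-12)  # True: the determinant is nonzero
```

For this H the determinant works out to -5 - 15i by hand: (2 - 3i)(3 - 3i) - (1 - i)(1 + i) = (-3 - 15i) - 2.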