[Lecture Notes] RMT2018 Spring Lectures (Percy A. Deift): Lecture 03
Let $\mathcal{S}_n$ denote the set of real $n \times n$ symmetric matrices. Suppose $M \in \mathcal{S}_n$ has eigenvalues $\lambda_1 \le \lambda_2 \le \cdots \le \lambda_n$ and associated eigenvectors $u_1, \dots, u_n$.

Let
$$\mathcal{S}_n' = \{ M \in \mathcal{S}_n : (u)_1 \neq 0 \text{ for all eigenvectors } u \text{ of } M \}.$$

Claim 1:
$\mathcal{S}_n'$ is a dense, open set in $\mathcal{S}_n$ of full measure, i.e., $|\mathcal{S}_n \setminus \mathcal{S}_n'| = 0$.

Note that
$$M \in \mathcal{S}_n' \implies \lambda_1 < \lambda_2 < \cdots < \lambda_n. \tag{42.1}$$
Indeed if $M \in \mathcal{S}_n'$ and $\lambda_i = \lambda_j$, $i \neq j$, then associated with $\lambda = \lambda_i$ there are at least two independent eigenvectors $u$, $v$, from which it follows that there is an eigenvector $w = \alpha u + \beta v \neq 0$ with $(w)_1 = 0$. This is a contradiction and so (42.1) is true.
Let
$$Z = \mathcal{S}_n \setminus \mathcal{S}_n' = \{ M \in \mathcal{S}_n : (u)_1 = 0 \text{ for some eigenvector } u \text{ of } M \}.$$
For an $n \times n$ matrix $M$, let $M_{11}$ denote the $(n-1) \times (n-1)$ matrix obtained by deleting the first row and column of $M$.

Claim 2:
$$Z = \{ M \in \mathcal{S}_n : M \text{ and } M_{11} \text{ have a common eigenvalue} \}.$$
Clearly $Z \subset \{M : M \text{ and } M_{11} \text{ have a common eigenvalue}\}$, for if $Mu = \lambda u$ with $(u)_1 = 0$, then $\hat{u} = (u_2, \dots, u_n)^T$ is an eigenvector for $M_{11}$, $M_{11}\hat{u} = \lambda \hat{u}$. Conversely, suppose that $M$ and $M_{11}$ have a common eigenvalue, but $M \notin Z$. Let $\lambda$ be a common eigenvalue of $M$ and $M_{11}$; then $M_{11} v = \lambda v$ for some $v \neq 0$ and, as $M \notin Z$, every eigenvector of $M$ has nonzero first entry. Without loss, suppose $\|v\| = 1$. Write
$$M = \begin{pmatrix} m_{11} & a^T \\ a & M_{11} \end{pmatrix}$$
where $a \in \mathbb{R}^{n-1}$ and $m_{11} \in \mathbb{R}$. As $\lambda$ is an eigenvalue of $M$, $Mu = \lambda u$ for some $u = (u_1, \hat{u}^T)^T \neq 0$, and from the last $n-1$ rows we have in particular
$$u_1 a + M_{11} \hat{u} = \lambda \hat{u}$$
and so
$$(M_{11} - \lambda)\hat{u} = -u_1 a.$$
Now $u$ is such that $u_1 \neq 0$, and
$$0 = \langle (M_{11} - \lambda)v, \hat{u} \rangle = \langle v, (M_{11} - \lambda)\hat{u} \rangle = -u_1 \langle v, a \rangle,$$
and so $\langle v, a \rangle = 0$. But then
$$M \begin{pmatrix} 0 \\ v \end{pmatrix} = \begin{pmatrix} a^T v \\ M_{11} v \end{pmatrix} = \lambda \begin{pmatrix} 0 \\ v \end{pmatrix}$$
and so $M$ has an eigenvector $w = (0, v^T)^T$ with $(w)_1 = 0$. This is a contradiction and so Claim 2 is true.
Note: By the proof of Claim 2, we in fact see that if $M \in Z$, then $M$ and $M_{11}$ have a "common" eigenvector, i.e., an eigenvector $v$ for $M_{11}$, $M_{11} v = \lambda v$, such that $(0, v^T)^T$ is an eigenvector for $M$, $M(0, v^T)^T = \lambda (0, v^T)^T$. Note however that not every eigenvector $v$ of $M_{11}$ has the property that $(0, v^T)^T$ is an eigenvector of $M$.

For example,
if
$$M = \begin{pmatrix} 1 & 1 & 0 \\ 1 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}, \qquad M_{11} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix},$$
then $\lambda = 1$ is a common eigenvalue of $M$ and $M_{11}$, and every $v \neq 0$ in $\mathbb{R}^2$ is an eigenvector of $M_{11}$. However $(0, v^T)^T$ is an eigenvector of $M$ if and only if $v$ is a multiple of $(0, 1)^T$.
We now show that $Z$ has measure 0. Let
$$p(\lambda) = \det(\lambda - M), \qquad q(\lambda) = \det(\lambda - M_{11}).$$
Write
$$p(\lambda) = \lambda^n + a_{n-1}\lambda^{n-1} + \cdots + a_0, \qquad q(\lambda) = \lambda^{n-1} + b_{n-2}\lambda^{n-2} + \cdots + b_0,$$
and consider the resultant $R$ of $p$ and $q$,
$$R = \det \begin{pmatrix}
1 & a_{n-1} & \cdots & a_0 & & \\
& \ddots & \ddots & & \ddots & \\
& & 1 & a_{n-1} & \cdots & a_0 \\
1 & b_{n-2} & \cdots & b_0 & & \\
& \ddots & \ddots & & \ddots & \\
& & 1 & b_{n-2} & \cdots & b_0
\end{pmatrix},$$
where the row $(1, a_{n-1}, \dots, a_0)$ is repeated $n-1$ times and the row $(1, b_{n-2}, \dots, b_0)$ is repeated $n$ times. Clearly $R$ is a determinant of size $n + (n-1) = 2n-1$. By standard theory (exercise), for any 2 monic polynomials $p$ and $q$,
$$R = 0 \iff p \text{ and } q \text{ have a common root}.$$
Now $R = R(M)$ is a real analytic function (in fact a polynomial) in the entries of $M$ and hence if it vanishes on a set of positive measure in $\mathcal{S}_n$, it is identically zero (exercise). But the Jacobi matrix
$$M_0 = \begin{pmatrix}
0 & 1 & & \\
1 & 0 & \ddots & \\
& \ddots & \ddots & 1 \\
& & 1 & 0
\end{pmatrix}$$
does not have a common eigenvalue with $(M_0)_{11}$, i.e., $R(M_0) \neq 0$. Indeed, if $\lambda$ were a common eigenvalue, then by the proof of Claim 2, $M_0 u = \lambda u$ for some $u \neq 0$ with $u_1 = 0$. But then from the first row of $(M_0 - \lambda)u = 0$ we see that $u_2 = 0$. Proceeding down the rows, $u_3 = \cdots = u_n = 0$. Thus $u = 0$, a contradiction. Hence we must have $R(M_0) \neq 0$, so $R \not\equiv 0$, and $|Z| = |\{M : R(M) = 0\}| = 0$.
Example:
Suppose $n = 2$,
$$p(\lambda) = \lambda^2 + a_1 \lambda + a_0, \qquad q(\lambda) = \lambda + b_0,$$
so $2n - 1 = 4 - 1 = 3$. Then
$$R = \det \begin{pmatrix} 1 & a_1 & a_0 \\ 1 & b_0 & 0 \\ 0 & 1 & b_0 \end{pmatrix} = b_0^2 - a_1 b_0 + a_0 = p(-b_0),$$
so if $-b_0$ is the root of $q$ and also a root of $p$, we see that $R = 0$.
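The $n = 2$ resultant can be checked in a few lines of code. A sketch assuming `numpy`; the function name `resultant_2_1` is mine:

```python
import numpy as np

def resultant_2_1(a1, a0, b0):
    """Sylvester determinant for p(x) = x^2 + a1*x + a0 and q(x) = x + b0:
    the row of p appears deg q = 1 time, the row of q appears deg p = 2 times."""
    S = np.array([[1.0, a1, a0],
                  [1.0, b0, 0.0],
                  [0.0, 1.0, b0]])
    return np.linalg.det(S)

# R equals p evaluated at the root -b0 of q, so R = 0 iff that root
# is also a root of p.
a1, a0, b0 = 2.0, -3.0, 5.0
p = lambda x: x**2 + a1 * x + a0
assert np.isclose(resultant_2_1(a1, a0, b0), p(-b0))   # both equal 12

# Common-root case: q(x) = x - 1 (b0 = -1) and p(1) = 1 + 2 - 3 = 0.
assert np.isclose(resultant_2_1(a1, a0, -1.0), 0.0)
```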
Exercise:
Generalize $R(p, q)$ to any 2 monic polynomials $p$ and $q$ of arbitrary orders.
Exercise:
Instead of $R$, we can consider associated tensors.
For a polynomial $p(z) = z^n + a_{n-1}z^{n-1} + \cdots + a_0$, the companion matrix for $p$ is defined as follows:
$$C_p = \begin{pmatrix}
0 & 1 & & \\
& \ddots & \ddots & \\
& & 0 & 1 \\
-a_0 & -a_1 & \cdots & -a_{n-1}
\end{pmatrix}, \qquad C_p \text{ is } n \times n.$$
Then (exercise) $p(z) = 0$ if and only if $z$ is an eigenvalue of $C_p$, i.e., $\det(z - C_p) = p(z)$.
If $q(z) = z^m + b_{m-1}z^{m-1} + \cdots + b_0$ is a second polynomial, with companion matrix $C_q$, $C_q$ is $m \times m$, set
$$A = C_p \otimes I_m - I_n \otimes C_q.$$
Show that
$$\det A = \prod_{i=1}^{n} \prod_{j=1}^{m} (z_i - w_j),$$
where $z_1, \dots, z_n$ are the roots of $p$ and $w_1, \dots, w_m$ are the roots of $q$,
and so $\det A = 0 \iff p$ and $q$ have a common root.
If $p(\lambda) = \det(\lambda - M)$, $q(\lambda) = \det(\lambda - M_{11})$, we see that $\det A$ is a polynomial in the entries of $M$, and the argument proceeds as before for $R$.
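The identity for $\det A$ can be spot-checked numerically, since $C_p \otimes I_m$ and $I_n \otimes C_q$ commute and the eigenvalues of $A$ are exactly the differences $z_i - w_j$. A sketch assuming `numpy`; the helper `companion` and the chosen roots are mine:

```python
import numpy as np

def companion(coeffs):
    """Companion matrix of z^n + c[n-1] z^{n-1} + ... + c[0],
    given coeffs = [c0, c1, ..., c_{n-1}]."""
    n = len(coeffs)
    C = np.zeros((n, n))
    C[:-1, 1:] = np.eye(n - 1)       # superdiagonal of 1s
    C[-1, :] = -np.asarray(coeffs)   # last row: -c0, ..., -c_{n-1}
    return C

# p with roots {1, 2, 3}, q with roots {4, 5}; np.poly returns
# leading-first coefficients, so reverse and drop the leading 1.
p_roots, q_roots = [1.0, 2.0, 3.0], [4.0, 5.0]
Cp = companion(np.poly(p_roots)[::-1][:-1])
Cq = companion(np.poly(q_roots)[::-1][:-1])

A = np.kron(Cp, np.eye(len(q_roots))) - np.kron(np.eye(len(p_roots)), Cq)

# det A = prod_{i,j} (z_i - w_j) = 144 for these roots.
expected = np.prod([z - w for z in p_roots for w in q_roots])
assert np.isclose(np.linalg.det(A), expected)
```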
Finally, we show that $\mathcal{S}_n'$ is dense and open, which completes the proof of Claim 1.
If $M \in \mathcal{S}_n'$, then $M$ has simple spectrum (see (42.1)) and $(u)_1 \neq 0$ for all eigenvectors $u$ of $M$. But as the spectrum of $M$ is simple, it follows by standard spectral theory that the eigenvalues and eigenvectors are continuous for matrices in a neighborhood of $M$; in particular, nearby matrices still have simple spectrum and eigenvectors with nonzero first entries. This shows that $\mathcal{S}_n'$ is open. On the other hand, as $|Z| = 0$, $\mathcal{S}_n'$ is certainly dense. We are done.
It follows from the above calculations that the spectral theorem
$$M = O \Lambda O^T, \qquad \Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_n),$$
induces a well-defined and smooth map
$$\varphi : M \mapsto (O, \lambda)$$
from $\mathcal{S}_n'$ into $K \times \mathbb{R}^n_<$,
where $K$ is the set of real orthogonal matrices whose columns have positive first entries, $O_{1j} > 0$, $1 \le j \le n$,
and where $\lambda = (\lambda_1, \dots, \lambda_n)$ is the vector of eigenvalues in
$$\mathbb{R}^n_< = \{ \lambda \in \mathbb{R}^n : \lambda_1 < \lambda_2 < \cdots < \lambda_n \}.$$
In fact $\varphi$ is a bijection with a smooth inverse: Indeed if
$$M = O \Lambda O^T = \tilde{O} \tilde{\Lambda} \tilde{O}^T,$$
then $\Lambda$ and $\tilde{\Lambda}$ both list the (simple) eigenvalues of $M$ in increasing order, and the normalized eigenvectors of $M$ are unique up to sign,
so $\Lambda = \tilde{\Lambda}$, and the condition $O_{1j}, \tilde{O}_{1j} > 0$ fixes the signs,
so $O = \tilde{O}$; hence $\varphi$ is 1-1. Also if $O \in K$ and $\lambda \in \mathbb{R}^n_<$, then $M = O \Lambda O^T$ is a real symmetric matrix with simple spectrum.
Moreover the columns of $O$ are clearly the eigenvectors of $M$ and $O_{1j} > 0$. Hence $M \in \mathcal{S}_n'$ and clearly $\varphi(M) = (O, \lambda)$. Thus $\varphi$ is a bijection. Clearly the inverse of $\varphi$ is given by
$$\varphi^{-1}(O, \lambda) = O \Lambda O^T$$
and is smooth.
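The map $\varphi$ and its inverse can be sketched numerically. The names `phi` and `phi_inv` are mine, and a random symmetric matrix lies in $\mathcal{S}_n'$ with probability 1 by Claim 1 (assuming `numpy`):

```python
import numpy as np

rng = np.random.default_rng(0)

def phi(M):
    """Spectral map M -> (O, lam): eigenvalues in increasing order, columns
    of O normalized to have positive first entry (M assumed in S_n')."""
    lam, O = np.linalg.eigh(M)      # eigh returns lam in increasing order
    O = O * np.sign(O[0, :])        # flip column signs so O[0, j] > 0
    return O, lam

def phi_inv(O, lam):
    return O @ np.diag(lam) @ O.T

B = rng.standard_normal((4, 4))
M = (B + B.T) / 2                   # random real symmetric matrix
O, lam = phi(M)

assert np.all(np.diff(lam) > 0)             # simple spectrum
assert np.all(O[0, :] > 0)                  # columns have positive first entry
assert np.allclose(O.T @ O, np.eye(4))      # O is orthogonal
assert np.allclose(phi_inv(O, lam), M)      # phi_inv inverts phi
```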
Now as $\mathcal{S}_n'$ is an open set of full measure, we can use $\varphi$ as a change of variables to compute probabilities. The first order of business is to compute the Jacobian
$$\frac{\partial M}{\partial (O, \lambda)}$$
on $\mathcal{S}_n'$.
Fix $M_0 \in \mathcal{S}_N'$ (from here on we write $N$ for $n$) and let
$$(O_0, \lambda^0) = \varphi(M_0).$$
Let $x = (x_1, \dots, x_q)$, $q = \frac{N(N-1)}{2}$, be local coordinates on $K$ in a small neighborhood of $O_0$,
where $O = O(x)$, $O_0 = O(x^0)$.
For $\epsilon$ sufficiently small, $(x, \lambda)$ with
$$|x - x^0| < \epsilon, \qquad |\lambda - \lambda^0| < \epsilon$$
are coordinates for an open neighborhood of $M_0$,
$$M = M(x, \lambda) = O(x) \Lambda(\lambda) O(x)^T, \qquad \Lambda(\lambda) = \operatorname{diag}(\lambda_1, \dots, \lambda_N).$$
Differentiating w.r.t. $x_j$,
we find
$$\frac{\partial M}{\partial x_j} = \frac{\partial O}{\partial x_j} \Lambda O^T + O \Lambda \frac{\partial O^T}{\partial x_j}.$$
But $O^T O = I$. Hence $O^T \frac{\partial O}{\partial x_j} + \frac{\partial O^T}{\partial x_j} O = 0$ and so
$$A_j := O^T \frac{\partial O}{\partial x_j} \tag{53.1}$$
is skew-symmetric. It follows that for $1 \le j \le q$,
$$O^T \frac{\partial M}{\partial x_j} O = A_j \Lambda - \Lambda A_j = [A_j, \Lambda]. \tag{53.2}$$
Similarly, for $1 \le k \le N$,
$$O^T \frac{\partial M}{\partial \lambda_k} O = \frac{\partial \Lambda}{\partial \lambda_k} = E_{kk}, \tag{53.3}$$
where $E_{kk}$ denotes the matrix with 1 in the $(k, k)$ entry and 0s elsewhere.
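For $N = 2$ the identities (53.1)–(53.3) can be checked directly, taking the rotation angle $x$ as the single local coordinate on $K$ ($q = 1$). The parametrization and finite-difference checks below are my own sketch, assuming `numpy`:

```python
import numpy as np

def O(x):
    """Rotation by angle x: a local parametrization of K for N = 2."""
    return np.array([[np.cos(x), -np.sin(x)],
                     [np.sin(x),  np.cos(x)]])

def M(x, lam):
    return O(x) @ np.diag(lam) @ O(x).T

x0, lam0, h = 0.3, np.array([1.0, 3.0]), 1e-6
Lam = np.diag(lam0)

# (53.1): A = O^T dO/dx is skew-symmetric.
dO = (O(x0 + h) - O(x0 - h)) / (2 * h)          # central difference
A = O(x0).T @ dO
assert np.allclose(A, -A.T, atol=1e-8)

# (53.2): O^T (dM/dx) O = [A, Lam].
dM_dx = (M(x0 + h, lam0) - M(x0 - h, lam0)) / (2 * h)
assert np.allclose(O(x0).T @ dM_dx @ O(x0), A @ Lam - Lam @ A, atol=1e-6)

# (53.3): O^T (dM/dlam_1) O = E_11.
e1 = np.array([1.0, 0.0])
dM_dlam1 = (M(x0, lam0 + h * e1) - M(x0, lam0 - h * e1)) / (2 * h)
assert np.allclose(O(x0).T @ dM_dlam1 @ O(x0), np.diag(e1), atol=1e-6)
```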
Consider the map
$$\Phi : S \mapsto O S O^T \tag{53.4}$$
mapping real symmetric matrices to real symmetric matrices. Let $f$ denote the bijection of real symmetric $N \times N$ matrices onto $\mathbb{R}^{N(N+1)/2}$ given by (cf. (8.29))
$$f(S) = \left(S_{11}, \dots, S_{NN}, \sqrt{2}\, S_{12}, \dots, \sqrt{2}\, S_{N-1,N}\right)^T.$$
Define the inner product on $\mathcal{S}_N$ as before by
$$(S, S') = \operatorname{tr}(S S'), \tag{54.1}$$
where $S, S' \in \mathcal{S}_N$. We find for $S, S' \in \mathcal{S}_N$,
$$(\Phi(S), \Phi(S')) = \operatorname{tr}(O S O^T O S' O^T) = \operatorname{tr}(S S') = (S, S').$$
It follows that
$$\Phi \text{ is orthogonal with respect to } (\cdot, \cdot), \tag{54.2}$$
and as before
$$|\det \Phi| = 1. \tag{54.3}$$
More precisely, if the matrix $F$ represents $\Phi$ in an orthonormal basis for $\mathcal{S}_N \cong \mathbb{R}^{N(N+1)/2}$, i.e., $F = f \circ \Phi \circ f^{-1}$, then $F$ is orthogonal and so $|\det F| = 1$. Now (53.2), (53.3) can be written in the form
$$\frac{\partial M}{\partial x_j} = \Phi([A_j, \Lambda]), \qquad \frac{\partial M}{\partial \lambda_k} = \Phi(E_{kk}), \tag{54.4}$$
where the elements on the RHS are $N \times N$ matrices,
or
$$f\left(\frac{\partial M}{\partial x_j}\right) = F f([A_j, \Lambda]), \qquad f\left(\frac{\partial M}{\partial \lambda_k}\right) = F f(E_{kk}), \tag{55.1}$$
where the elements on the RHS are column vectors of size $N(N+1)/2$.
Hence by (54.3) and the representation (55.1),
$$\left| \det\left( f\!\left(\frac{\partial M}{\partial \lambda_1}\right), \dots, f\!\left(\frac{\partial M}{\partial \lambda_N}\right), f\!\left(\frac{\partial M}{\partial x_1}\right), \dots, f\!\left(\frac{\partial M}{\partial x_q}\right) \right) \right| = \left| \det\Big( f(E_{11}), \dots, f(E_{NN}), f([A_1, \Lambda]), \dots, f([A_q, \Lambda]) \Big) \right|. \tag{55.2}$$
Now,
$$f(E_{kk}) = (0, \dots, 0, 1, 0, \dots, 0)^T, \tag{55.3}$$
where the 1 is at the $k$-th place. Also $([A_j, \Lambda])_{kl} = (\lambda_l - \lambda_k)(A_j)_{kl}$, so $([A_j, \Lambda])_{kk} = 0$ for $1 \le k \le N$, and so for $1 \le j \le q$,
$$f([A_j, \Lambda]) = \left(0, \dots, 0, \sqrt{2}(\lambda_2 - \lambda_1)(A_j)_{12}, \dots, \sqrt{2}(\lambda_N - \lambda_{N-1})(A_j)_{N-1,N}\right)^T \tag{55.4}$$
with 0s in the first $N$ entries. Thus graphically we have
$$\Big( f(E_{11}), \dots, f(E_{NN}), f([A_1, \Lambda]), \dots, f([A_q, \Lambda]) \Big) = \begin{pmatrix} I_N & 0 \\ 0 & X \end{pmatrix}, \tag{56.0}$$
where $X$ denotes the $q \times q$ matrix
$$X = \left( \sqrt{2}\,(\lambda_l - \lambda_k)(A_j)_{kl} \right)_{k < l,\ 1 \le j \le q}, \tag{56.1}$$
with rows indexed by the pairs $(k, l)$, $k < l$, and columns indexed by $j$, and so
$$\det X = 2^{q/2} \prod_{1 \le k < l \le N} (\lambda_l - \lambda_k) \cdot \det\left( (A_j)_{kl} \right)_{k < l,\ j},$$
which implies by (55.2), (56.0),
$$\left| \frac{\partial M}{\partial (x, \lambda)} \right| = \prod_{1 \le k < l \le N} (\lambda_l - \lambda_k) \cdot |g(x)|, \qquad g(x) = 2^{q/2} \det\left( (A_j(x))_{kl} \right)_{k < l,\ j}. \tag{56.2}$$
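For $N = 2$ with the rotation-angle parametrization $O(x)$, one finds $A_1 = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$ for all $x$, so $|g| = \sqrt{2}$ and (56.2) reads $|\partial M / \partial(x, \lambda)| = \sqrt{2}\,(\lambda_2 - \lambda_1)$. This can be confirmed with a finite-difference Jacobian; the sketch below is mine, assuming `numpy`:

```python
import numpy as np

def O(x):
    """Rotation by angle x: a local parametrization of K for N = 2."""
    return np.array([[np.cos(x), -np.sin(x)],
                     [np.sin(x),  np.cos(x)]])

def fM(v):
    """Compose (lam_1, lam_2, x) -> M -> f(M) = (M_11, M_22, sqrt(2) M_12)."""
    lam1, lam2, x = v
    S = O(x) @ np.diag([lam1, lam2]) @ O(x).T
    return np.array([S[0, 0], S[1, 1], np.sqrt(2) * S[0, 1]])

v0, h = np.array([1.0, 3.0, 0.3]), 1e-6
J = np.zeros((3, 3))
for j in range(3):                   # central-difference Jacobian, column by column
    e = np.zeros(3); e[j] = h
    J[:, j] = (fM(v0 + e) - fM(v0 - e)) / (2 * h)

# (56.2): |det J| = sqrt(2) * (lam_2 - lam_1) = sqrt(2) * 2
assert np.isclose(abs(np.linalg.det(J)), np.sqrt(2) * 2.0, atol=1e-4)
```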
As $O(x)$ and $\frac{\partial O}{\partial x_j}(x)$ are smooth, $g(x)$ is smooth (why?).