Creating a matrix with desirable eigenvalues

  Nov 12, 2013 #1
    Hello,

    I want to generate a (large) matrix whose eigenvalues all lie in a small interval. The goal is to make the ratio between the maximum and minimum eigenvalues as small as possible, and all the eigenvalues must be positive.

    Is there any simple way to do this? I'm using Mathcad, and I've built a function to randomly generate a symmetric matrix. To make all the eigenvalues positive, I multiply the matrix by its transpose. But the eigenvalues end up scattered over a large interval, and the ratio between the maximum and minimum eigenvalues is in the hundreds or thousands. Are there any tricks to bring the eigenvalues closer together?
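
    In NumPy terms (just a rough sketch of the idea, not my actual Mathcad code), the construction and the problem look something like this:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100                            # some large-ish size
    A = rng.standard_normal((n, n))    # random matrix
    S = A @ A.T                        # symmetric and (almost surely) positive definite

    w = np.linalg.eigvalsh(S)          # eigenvalues of a symmetric matrix, ascending
    print(w[0], w[-1])
    print("max/min ratio:", w[-1] / w[0])   # typically in the hundreds or thousands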
     
  Nov 12, 2013 #2

    Office_Shredder

    Staff Emeritus
    Science Advisor
    Gold Member

    Are you trying to generate random matrices? Because if all you want is a single such matrix, you can just pick a diagonal one.

    Otherwise, pick a diagonal matrix randomly and conjugate it by a random orthogonal transformation. The eigenvalues are determined by the diagonal entries and the eigenvectors by the orthogonal matrix, so by picking each with an appropriate distribution you can get your eigenvalues and eigenvectors to have whatever distribution you want without too much fuss.
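
    A rough NumPy sketch of that recipe (Python rather than Mathcad; taking the Q factor of a random Gaussian matrix as the random orthogonal transformation, which is one common choice):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100

    # Pick the eigenvalues you want first, e.g. uniformly in the narrow interval [1.0, 1.1].
    eigenvalues = rng.uniform(1.0, 1.1, size=n)
    D = np.diag(eigenvalues)

    # A random orthogonal matrix: the Q factor of a random Gaussian matrix.
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

    # Conjugation: M = Q D Q^T is symmetric with exactly the chosen eigenvalues.
    M = Q @ D @ Q.T

    print(np.allclose(np.linalg.eigvalsh(M), np.sort(eigenvalues)))  # True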
     
  Nov 12, 2013 #3
    Thanks for replying. So that's the easiest way, right? Could you give a simple example of conjugating a small diagonal matrix by an orthogonal transformation?
     
  Nov 12, 2013 #4

    hilbert2

    Science Advisor
    Gold Member

    ^ Let's say you want a 2x2 matrix with eigenvalues 2 and 3. First you form the diagonal matrix

    ##\begin{pmatrix}2&0\\0&3\end{pmatrix}##

    then you multiply this matrix from the left by the rotation matrix (which is orthogonal)

    ##\begin{pmatrix}\cos\theta&\sin\theta\\-\sin\theta&\cos\theta\end{pmatrix}##

    and finally you multiply the resulting matrix from the right by the inverse rotation matrix

    ##\begin{pmatrix}\cos\theta&-\sin\theta\\\sin\theta&\cos\theta\end{pmatrix}##

    If the parameter ##\theta## is chosen randomly from the interval ##[0,2\pi]##, the eigenvalues probably won't be obvious from the resulting matrix, but they will still be 2 and 3.
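
    In code (a NumPy sketch rather than Mathcad), the same 2x2 construction and check look like this:

    import numpy as np

    theta = 2 * np.pi * np.random.rand()          # random angle in [0, 2*pi)
    D = np.array([[2.0, 0.0],
                  [0.0, 3.0]])
    R = np.array([[ np.cos(theta), np.sin(theta)],
                  [-np.sin(theta), np.cos(theta)]])

    M = R @ D @ R.T                # R is orthogonal, so its inverse is its transpose
    print(np.linalg.eigvalsh(M))   # approximately [2. 3.]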
     
  Nov 12, 2013 #5
    Thank you both for the help, much appreciated
     
  Nov 12, 2013 #6

    AlephZero

    Science Advisor
    Homework Helper

    Look at the test programs that come with the LAPACK library, and their documentation. There are matrix generation routines that do this type of thing, and for "large" matrices they will probably have better numerical behavior than code you invent for yourself. IIRC they use other numerical methods besides the idea in post #4. http://www.netlib.org/lapack/
     
  Nov 13, 2013 #7
    I'm afraid acquainting myself with LAPACK would take too much time... Also, it turns out the rotation-matrix approach isn't exactly what I was looking for, because what I need is a 100x100 matrix, not a 2x2 or 3x3 one.
     
  Nov 13, 2013 #8

    hilbert2

    Science Advisor
    Gold Member

    You don't have to use a rotation matrix as the orthogonal transformation. You can form other orthogonal transformations by making ##N \times N## matrices that have a set of ##N## orthonormal vectors as their columns. To make a set of ##N## orthonormal vectors in ##\mathbb{R}^N##, just create ##N## random vectors and apply the Gram-Schmidt orthogonalization method to them.

    Once you have created the orthogonal matrix ##U## that has the random orthonormal vectors as its columns, you just make the diagonal matrix ##D## that has the desired eigenvalues and compute ##U^{-1}DU##, which is a random-looking matrix with exactly the eigenvalues you want. Remember that for an orthogonal matrix ##U##, ##U^{-1}## is just the transpose of ##U##.
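
    A rough NumPy sketch of the whole procedure (Python instead of Mathcad; the explicit Gram-Schmidt loop follows the description above, though in practice a built-in QR decomposition is numerically more stable):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100

    # Desired eigenvalues: all positive and close together, e.g. in [1.0, 1.05].
    eigenvalues = rng.uniform(1.0, 1.05, size=n)
    D = np.diag(eigenvalues)

    # Gram-Schmidt on n random vectors gives an orthogonal matrix U
    # (orthonormal columns). Random Gaussian vectors are almost surely
    # linearly independent, so the normalization never divides by zero.
    V = rng.standard_normal((n, n))
    U = np.zeros_like(V)
    for k in range(n):
        u = V[:, k].copy()
        for j in range(k):
            u -= (U[:, j] @ u) * U[:, j]   # subtract the component along U[:, j]
        U[:, k] = u / np.linalg.norm(u)

    # Conjugate the diagonal matrix: for orthogonal U, U^{-1} = U^T.
    M = U.T @ D @ U

    print(np.allclose(np.linalg.eigvalsh(M), np.sort(eigenvalues)))    # True
    print("max/min ratio:", eigenvalues.max() / eigenvalues.min())     # about 1.05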

    For more info, see

    http://en.wikipedia.org/wiki/Orthogonal_matrix

    http://en.wikipedia.org/wiki/Gram-schmidt
     