Optimizing Eigenvector Computations for Large Matrices

In summary: the poster is working on a method of visualising graphs that uses eigenvector computations, and is unsure whether each entry of an approximated matrix needs to be multiplied by 5, but is open to other opinions.
  • #1
onako
(* corresponds to the matrix product)
I'm working on a method of visualising graphs, and that method uses eigenvector computations. For a certain square matrix K (whose entries are the result of C_transpose*C, so K is symmetric) I have to compute the eigenvectors.
Since C is mXn, where m>>n, I use only a portion of C: NOT the dot product of the COMPLETE first row of C_transpose with the complete first column of C, but the dot product of a PORTION (1/5, perhaps) of the first row of C_transpose with the corresponding portion of the first column of C. To get a good approximation of the original COMPLETE C_transpose*C, I'm not sure whether I need to multiply each entry of K by 5 (see the 1/5 above).
How would the eigenvectors behave if I do not perform the multiplication by 5?

In addition, any other suggestions on how to approximate C_transpose*C, where C is an mXn matrix with m>>n, are very welcome.
I hope I explained the problem properly.
Thanks
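
For concreteness, here is a minimal NumPy sketch of the sampling idea described in #1 (the function name approx_gram and the fraction parameter are illustrative, not from the thread): sample a fraction of the rows of C, form the small Gram matrix, and rescale by 1/fraction.
Code:
import numpy as np

def approx_gram(C, fraction=0.2, seed=None):
    """Approximate K = C^T C by sampling a fraction of the rows of C.

    Each entry of K is a dot product over the m rows of C; using a
    random subset of the rows and rescaling by 1/fraction gives an
    unbiased estimate of the complete product.
    """
    rng = np.random.default_rng(seed)
    m = C.shape[0]
    k = max(1, int(fraction * m))           # number of sampled rows
    idx = rng.choice(m, size=k, replace=False)
    Cs = C[idx, :]                          # sampled portion of C
    return (1.0 / fraction) * (Cs.T @ Cs)   # rescale to match the full K

# Quick check against the exact Gram matrix:
C = np.random.randn(10000, 20)              # m >> n
K_exact = C.T @ C
K_approx = approx_gram(C, fraction=0.2, seed=0)
print(np.linalg.norm(K_exact - K_approx) / np.linalg.norm(K_exact))
The relative error shrinks as the sampled fraction grows; without the 1/fraction rescaling the eigenvalues come out roughly 5 times too small, while the eigenvectors are unaffected (see #3 below).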
 
  • #2
Anyone?
 
  • #3
The answer to this: if the entries of a real symmetric matrix are all scaled by a constant, the eigenvalues are scaled by the same constant, but the eigenvectors stay the same.
However, I would really be happy to hear your opinion on this:
Code:
any other suggestions on how to approximate C_transpose*C, where C is an mXn matrix with m>>n, are very welcome.
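As a quick numerical check of the scaling claim in #3 (a sketch, assuming NumPy): multiplying a symmetric matrix by 5 multiplies its eigenvalues by 5 but leaves its eigenvectors unchanged, up to sign, which eigensolvers do not fix.
Code:
import numpy as np

A = np.random.randn(6, 4)
K = A.T @ A                                 # symmetric (in fact, a Gram matrix)
w1, V1 = np.linalg.eigh(K)                  # eigenpairs of K
w2, V2 = np.linalg.eigh(5 * K)              # eigenpairs of 5K

print(np.allclose(w2, 5 * w1))              # True: eigenvalues scale by 5
print(np.allclose(np.abs(V1), np.abs(V2)))  # True: eigenvectors agree up to sign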
 
  • #4
Let me understand: you want to approximate the dot product of two very long vectors

[tex]x\cdot y=\sum_{i=1}^nx_iy_i[/tex]

by a kind of averaged value

[tex]x\cdot y\approx 5\sum_{i=1}^{n/5}x_iy_i[/tex]

right?
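In code, this averaged estimate looks like the following (a sketch, assuming NumPy; the factor 5 is just one choice of sampling fraction, and the correlated y is only there to make the dot product large enough to compare):
Code:
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = x + 0.1 * rng.standard_normal(100_000)  # correlated with x, so x.y is large

full = x @ y                        # exact dot product over all entries
k = len(x) // 5                     # use only the first fifth of the entries
partial = 5 * (x[:k] @ y[:k])       # rescaled partial dot product

print(full, partial)                # partial is a noisy but unbiased estimate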
 
  • #5
Yes, but the goal is for the resulting approximate matrix to have eigenvectors close to those of the matrix I would obtain with the complete dot products. Note that the multiplication by 5 is not necessary (it is needed to approximate the matrix obtained with the complete dot products, but the resulting eigenvectors stay the same). The method is just one part of a sophisticated algorithm, which shows acceptable results with this approach.

However, I would like to hear other opinions, too.
 
  • #6
Maybe I'm wrong, but it seems to me that the condition m >> n is not necessary for your approach; it is enough that m is very large. Why should n be small compared to m?
 

1. What are eigenvectors under scaling?

"Eigenvectors under scaling" refers to the property that eigenvectors remain unchanged in direction under multiplication by a scalar (scaling factor): scaling a matrix scales its eigenvalues but leaves its eigenvectors unchanged, and scaling an eigenvector itself changes only its magnitude, not its direction.
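
The one-line derivation behind this: if v is an eigenvector of K with eigenvalue λ, then for any scalar c

[tex]Kv=\lambda v\quad\Rightarrow\quad (cK)v=c(Kv)=(c\lambda)v,[/tex]

so v is still an eigenvector of cK, now with eigenvalue cλ.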

2. How do you find eigenvectors under scaling?

To find them, first compute the eigenvalues of the matrix. Then, for each eigenvalue, solve the eigenvalue-eigenvector equation for the corresponding eigenvector. Because scaling the matrix does not change these vectors, the same eigenvectors are obtained for any scalar multiple of the matrix.
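
In practice one rarely solves this by hand; a minimal sketch using NumPy's symmetric eigensolver (the example matrix is arbitrary):
Code:
import numpy as np

K = np.array([[2.0, 1.0],
              [1.0, 2.0]])        # small symmetric example matrix

w, V = np.linalg.eigh(K)          # eigenvalues in ascending order
print(w)                          # [1. 3.]
print(V)                          # columns of V are the eigenvectors
print(np.allclose(K @ V, V * w))  # True: K v = lambda v, column by column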

3. What is the significance of eigenvectors under scaling?

Eigenvectors under scaling are important in linear algebra as they represent the directions in which a matrix only scales and does not rotate or shear. They also have practical applications in fields such as computer graphics, physics, and engineering.

4. Can a matrix have multiple eigenvectors under scaling?

Yes, a matrix can have multiple eigenvectors under scaling. This is because the scalar multiples of an eigenvector are also eigenvectors. Therefore, a matrix can have an infinite number of eigenvectors under scaling for a given eigenvalue.
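
The reason, again in one line: if Kv = λv, then for any nonzero scalar c

[tex]K(cv)=c(Kv)=c\lambda v=\lambda(cv),[/tex]

so cv is an eigenvector for the same eigenvalue λ; the eigenvectors for λ (together with the zero vector) form a subspace.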

5. How are eigenvectors under scaling used in data analysis?

In data analysis, eigenvectors are used in principal component analysis (PCA) to reduce the dimensionality of a dataset. By taking the top eigenvectors of the data's covariance matrix as the principal components, we can capture most of the variability in the data and represent it in a lower-dimensional space.
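
A minimal PCA sketch via the eigendecomposition of the covariance matrix (assuming NumPy; the data sizes are arbitrary):
Code:
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))     # 500 samples, 10 features

Xc = X - X.mean(axis=0)                # center the data
cov = (Xc.T @ Xc) / (len(Xc) - 1)      # sample covariance matrix
w, V = np.linalg.eigh(cov)             # eigenvalues ascending, eigenvectors in columns

k = 2
components = V[:, -k:]                 # top-k principal directions
Z = Xc @ components                    # data represented in k dimensions
print(Z.shape)                         # (500, 2)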
