- #1
Trollfaz
- 140
- 14
I have learnt about the power iteration for a square matrix, say A.
How it works is that we start with a random compatible vector $v_0$ and define $v_{n+1}$ as
$$v_{n+1} = \frac{A v_n}{\lVert A v_n \rVert_\infty},$$
where $\lVert A v_n \rVert_\infty$ is the magnitude of the entry of $A v_n$ with the largest absolute value.
After a sufficiently large number of iterations, $v_n$ converges to the eigenvector associated with the dominant eigenvalue of A, provided A has a unique eigenvalue of largest magnitude and $v_0$ has a nonzero component along its eigenvector. To get the dominant eigenvalue, simply divide any entry of $A v_n$ by the corresponding entry of $v_n$. The inverse power iteration can be used to find the least dominant eigenvalue of A by performing the same iterations on $A^{-1}$, assuming A is non-singular; otherwise the least dominant eigenvalue of A is automatically 0.
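For example, with $A = \begin{pmatrix} 2 & 0 \\ 0 & 1 \end{pmatrix}$ and $v_0 = (1, 1)$: $A v_0 = (2, 1)$, so $v_1 = (1, 1/2)$; then $A v_1 = (2, 1/2)$, so $v_2 = (1, 1/4)$. In general $v_n = (1, 2^{-n}) \to (1, 0)$, the eigenvector for the dominant eigenvalue, and the first entry of $A v_n$ divided by the first entry of $v_n$ is exactly $2$.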
Here's what my function looks like in Python:
def dominant_eigen(matrix, precision):
    # precision is the number of iterations; random_vector,
    # matrix_vector_product and infinity_norm_v are my own helpers
    v = random_vector(len(matrix))
    for i in range(precision):
        v = matrix_vector_product(matrix, v)   # takes in A and v and returns Av
        norm = infinity_norm_v(v)              # magnitude of the dominant entry of v
        v = tuple(x / norm for x in v)         # rescale so the dominant entry has magnitude 1
    Av = matrix_vector_product(matrix, v)
    return {'eigenvector': v, 'eigenvalue': Av[0] / v[0]}
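For the inverse power iteration, a minimal self-contained sketch could look like this (this one assumes numpy; the function name, test matrix and iteration count are just for illustration). Instead of forming $A^{-1}$ explicitly, each step solves $A x = v_n$, which gives the same result as computing $x = A^{-1} v_n$:

import numpy as np

def least_dominant_eigen(A, iterations=100):
    # Inverse power iteration: repeatedly apply A^-1 by solving A x = v,
    # so the inverse is never formed explicitly. Assumes A is non-singular.
    v = np.random.default_rng(0).random(len(A))
    for _ in range(iterations):
        x = np.linalg.solve(A, v)    # x = A^-1 v
        v = x / np.abs(x).max()      # rescale by the dominant entry's magnitude
    Av = A @ v
    return {'eigenvector': v, 'eigenvalue': Av[0] / v[0]}

A = np.array([[4.0, 1.0], [2.0, 3.0]])        # eigenvalues 5 and 2
print(least_dominant_eigen(A)['eigenvalue'])  # ~2.0, the least dominant eigenvalue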
Are there computational methods to get all the eigenvalues of any matrix?