porcupine6789 said:
Can someone explain eigenvalues and eigenvectors to me?
Feldoh said: If w = Av is in the same direction as v, then w must be some scalar multiple of v; in other words, w = \lambda v, where \lambda is just some scalar value.
If this is the case, we can write Av = \lambda v, where \lambda is an eigenvalue and v is an eigenvector of the matrix A.
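A quick numerical check of this relation (a sketch in NumPy; the matrix A and its entries are just illustrative, not from the thread):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # an illustrative symmetric 2x2 matrix

eigenvalues, eigenvectors = np.linalg.eig(A)

for i in range(len(eigenvalues)):
    lam = eigenvalues[i]
    v = eigenvectors[:, i]           # eigenvectors are the columns
    # A v and lambda v should agree: A leaves the direction of v
    # unchanged and only scales it by lambda.
    print(np.allclose(A @ v, lam * v))   # True for each eigenpair
```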
maxverywell said: I have some questions about this. If w is in the same direction as v, this means that A (an operator represented by the matrix A) doesn't change the direction of v, i.e., A doesn't rotate the vector. What do we call such operators (or matrices)? Are they just linear transformations?
HallsofIvy said: A linear operator on a vector space is a function, F, such that F(au + bv) = aF(u) + bF(v), where u and v are vectors and a and b are scalars.
A matrix is, of course, an array of numbers with specific addition and multiplication defined.
We can think of an m by n matrix as a linear transformation from the vector space R^n to the vector space R^m.
More generally, if F is a linear transformation from a vector space of dimension n to a vector space of dimension m, then, given specific ordered bases for the two spaces, it can be represented as an m by n matrix.
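As a concrete check of both points (again a NumPy sketch with made-up values), a 2 by 3 matrix acts as a map from R^3 to R^2, and matrix-vector multiplication satisfies the linearity condition above:

```python
import numpy as np

M = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])     # 2x3 matrix: maps R^3 -> R^2

u = np.array([1.0, 2.0, 3.0])
v = np.array([0.0, 1.0, -1.0])
a, b = 2.0, -5.0

lhs = M @ (a * u + b * v)           # F(a u + b v)
rhs = a * (M @ u) + b * (M @ v)     # a F(u) + b F(v)
print(np.allclose(lhs, rhs))        # True: the map is linear
```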
But linear transformations are not the same as matrices. For one thing, the correspondence depends on a specific choice of basis (though the matrices representing the same linear transformation in different bases are "similar" matrices). Further, there exist infinite-dimensional vector spaces on which linear transformations cannot be represented by (finite) matrices. One example is the vector space of all polynomials (with the usual addition of polynomials and multiplication by numbers), with differentiation of the polynomial as the linear operator.
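To see both halves of that point in code (a sketch, assumptions mine): if we truncate to polynomials of degree at most 3, differentiation IS represented by a finite matrix in the monomial basis {1, x, x^2, x^3}; on the full, infinite-dimensional space of all polynomials no finite matrix suffices, since each power x^n requires one more basis vector.

```python
import numpy as np

# D[i, j] = coefficient of x^i in d/dx(x^j); since d/dx(x^j) = j*x^(j-1),
# the only nonzero entries are D[j-1, j] = j.
n = 4                                 # basis size: 1, x, x^2, x^3
D = np.zeros((n, n))
for j in range(1, n):
    D[j - 1, j] = j

p = np.array([5.0, 0.0, 3.0, 2.0])    # p(x) = 5 + 3x^2 + 2x^3
dp = D @ p                            # p'(x) = 6x + 6x^2
print(dp)                             # [0. 6. 6. 0.]
```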