Recent content by TomMe

  1. Can the composition of 2 functions be proven using 3 formulas?

    I'm not really well versed in the use of logic, so forgive me if I don't fully understand your explanation above. I know the basic concepts of propositional logic, which is extremely useful when doing proofs. But in this case I seem to have a problem of interpretation. Trying to unite both...
  2. Can the composition of 2 functions be proven using 3 formulas?

    So I can't write down \forall a \in A : \exists! b \in B : (a,b) \in f and \forall b \in B : \exists! c \in C : (b,c) \in g and use the laws of logic to move around some quantifiers to get \forall a \in A ...
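
    A minimal sketch of how that quantifier argument could go, assuming g \circ f is defined as the relation \{(a,c) : \exists b \in B : (a,b) \in f \wedge (b,c) \in g\} (that definition is my assumption, not quoted from the thread):

        % Existence: fix a \in A. By (1) there is a (unique) b \in B with (a,b) \in f;
        % applying (2) to that b gives a c \in C with (b,c) \in g, hence (a,c) \in g \circ f:
        \forall a \in A \; \exists c \in C : (a,c) \in g \circ f
        % Uniqueness: if (a,c) and (a,c') both lie in g \circ f, pick witnesses b, b' \in B
        % with (a,b), (a,b') \in f and (b,c), (b',c') \in g. The ! in (1) forces b = b',
        % and the ! in (2) then forces c = c':
        \forall a \in A \; \exists! c \in C : (a,c) \in g \circ f
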
  3. Can the composition of 2 functions be proven using 3 formulas?

    Well, yes that's what I meant by explaining in words. But is there a way to do this using formulas and laws of logic?
  4. Can the composition of 2 functions be proven using 3 formulas?

    Can anyone show me how to prove exactly that the composition of 2 functions is again a function, by using the following 3 formulas? Suppose f: A -> B and g: B -> C are functions, then (1) \forall a \in A : \exists! b \in B : (a,b) \in f (2) \forall b \...
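
    The preview cuts off after (2); the three formulas are presumably meant as below (the explicit forms of (2) and (3) are my reconstruction of the standard statement, not quoted from the post):

        (1) \forall a \in A \; \exists! b \in B : (a,b) \in f
        (2) \forall b \in B \; \exists! c \in C : (b,c) \in g
        (3) \forall a \in A \; \exists! c \in C : (a,c) \in g \circ f

    Deriving (3) from (1) and (2) is exactly the statement that g \circ f : A -> C is a function.
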
  5. Why is A(w) - A_R(w) in W_1 in the Triangularization Theorem?

    Okay.. assuming I'm right, can anyone tell me why this is important for the proof? Because I can't see it. Proof continued (proof by induction on the dimension): By calculating |tI_{p} - A| using cofactor expansion along the first column we get f_{A}(t) = (t -...
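
    A sketch of where that factorization presumably comes from, assuming [A] is the matrix of A with respect to the basis v_1, ..., v_n from the later post, A_R is the map given by the lower-right (n-1) x (n-1) block, and n is the dimension (written p in the post):

        [A] = \left(\begin{array}{cc} \lambda_1 & * \\ 0 & [A_R] \end{array}\right)
        % The first column of tI_n - [A] is (t - \lambda_1, 0, ..., 0)^T, so cofactor
        % expansion along that column picks up only the (1,1) entry:
        f_A(t) = (t - \lambda_1) \det(tI_{n-1} - [A_R]) = (t - \lambda_1) f_{A_R}(t)
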
  6. Why is A(w) - A_R(w) in W_1 in the Triangularization Theorem?

    Oh wait, if I use A_{R} on w \in W_{2}, I have to convert that w to its coordinates with respect to the basis of W_{2}. Then these n-1 coordinate numbers match the last n-1 coordinate numbers of A(w) with respect to the basis of V, is that right? So if I subtract these coordinates, I...
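
    In symbols, the coordinate argument sketched here comes down to the following (with W_2 := <v_2, ..., v_n>, which is my reading of the setup, not quoted from the thread):

        % For w \in W_2, expand A(w) in the basis v_1, ..., v_n of V:
        A(w)   = c_1 v_1 + c_2 v_2 + \dots + c_n v_n
        % A_R(w) is built from the last n-1 of those coordinates only:
        A_R(w) =           c_2 v_2 + \dots + c_n v_n
        % so the difference has only a v_1-component:
        A(w) - A_R(w) = c_1 v_1 \in W_1
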
  7. Why is A(w) - A_R(w) in W_1 in the Triangularization Theorem?

    Okay.. I have to subtract 2 transformations then. How do I do that? I know that by definition (f+g)(x) = f(x) + g(x), but what are f(x) and g(x) in this case? I can see the coordinates with respect to both bases, but they have a different number of coordinate numbers, right? So I can't subtract them...
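
    One way to read the subtraction, under the assumption that W_2 := <v_2, ..., v_n> sits inside V: both A(w) and A_R(w) can be viewed as vectors of V, so the difference is taken in V rather than coordinate-wise across two different bases:

        (A - A_R)(w) := A(w) - A_R(w) \in V, \qquad w \in W_2
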
  8. Why is A(w) - A_R(w) in W_1 in the Triangularization Theorem?

    To get to the point I don't understand that well, I have to give part of the proof: So suppose \lambda_{1} is an eigenvalue of A and v_{1} an eigenvector with this eigenvalue. Then W_{1} := <v_{1}> is an A-invariant subspace of V. Extend v_{1} to a basis v_{1}, v_{2}, \ldots, v_{n} of V...
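
    A one-line check of the A-invariance claim (just the standard argument, not quoted from the thread): since A(v_1) = \lambda_1 v_1,

        A(c\,v_1) = c\,A(v_1) = c\,\lambda_1 v_1 \in W_1 = <v_1> \quad \text{for every scalar } c,
        % so the first column of the matrix of A w.r.t. v_1, ..., v_n is (\lambda_1, 0, ..., 0)^T.
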
  9. Singular Value Decomposition trouble

    Sorry, didn't see your second post there, mathwonk. I must say I'm not yet familiar with the term isomorphism; it's in my course text, but I haven't covered it yet. So I read your post 3 times.. :shy: But I think I understand what you mean. I'm going to try this out tomorrow with a couple of...
  10. Singular Value Decomposition trouble

    Thanks George! That explains it! :smile: I'm not sure I would have seen this myself eventually. Like you guessed, I was stuck on the A A^T and A^T A.. I am aiming to become a physicist, but at the moment I'm not sure how to do this. I tried a few times as a full-time student, but the stress...
  11. Singular Value Decomposition trouble

    I cannot explain it better than the webpage I linked to. Singular values appear to be the positive square roots of the eigenvalues of A^T A and A A^T for a matrix A of any size. I think they play a role similar to that of eigenvalues for square matrices. I think SVD is a relatively new subject. It...
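
    A quick numerical illustration of that claim; numpy and the random 4 x 3 example are my own choices, not something from the thread:

        import numpy as np

        # Claim being checked: the singular values of A are the positive square roots
        # of the (nonzero) eigenvalues of A^T A and of A A^T, for A of any size.
        rng = np.random.default_rng(0)
        A = rng.standard_normal((4, 3))                        # rectangular example

        sigma = np.linalg.svd(A, compute_uv=False)             # singular values, descending
        eig_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]   # 3 eigenvalues of A^T A
        eig_AAt = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]   # 4 eigenvalues of A A^T (one is ~0)

        print(np.allclose(sigma, np.sqrt(eig_AtA)))            # True
        print(np.allclose(sigma, np.sqrt(eig_AAt[:3])))        # True
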
  12. Linear and Abstract Algebra textbooks

    Don't get me wrong, I have nothing against proofs. If anything, I also like to see everything proven. It's just that I don't see how going over theorem after theorem in class gives the student real insight into the subject, especially during the first year. How often I just found myself...
  13. Singular Value Decomposition trouble

    I understand. I just thought this was a well-known topic and that people would immediately recognize what the problem is. Strange. http://en.wikipedia.org/wiki/Singular_value_decomposition
  14. Singular Value Decomposition trouble

    Hm, I guess this is not a popular subject then..
  15. Singular Value Decomposition trouble

    Suppose I want to decompose A = \left(\begin{array}{cc}4&4\\-3&3\end{array}\right). A = U \Sigma V^T \Rightarrow A^T A = V \Sigma^2 V^T and A A^T = U \Sigma^2 U^T. So 2 independent eigenvectors of A^T A are a basis for the row space of A, and 2 independent eigenvectors of A A^T are a basis for the...
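
    For this particular A, a numpy check of those identities (numpy is my choice of tool; the identities themselves are the ones stated above):

        import numpy as np

        A = np.array([[4.0, 4.0],
                      [-3.0, 3.0]])

        U, s, Vt = np.linalg.svd(A)      # A = U @ np.diag(s) @ Vt
        Sigma = np.diag(s)

        print(np.allclose(A, U @ Sigma @ Vt))                # True: A = U Sigma V^T
        print(np.allclose(A.T @ A, Vt.T @ Sigma**2 @ Vt))    # True: A^T A = V Sigma^2 V^T
        print(np.allclose(A @ A.T, U @ Sigma**2 @ U.T))      # True: A A^T = U Sigma^2 U^T
        print(s)                         # singular values: sqrt(32) and sqrt(18)

    The columns of Vt.T are then orthonormal eigenvectors of A^T A, and the columns of U are orthonormal eigenvectors of A A^T.
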