Recent content by bjgawp

  1. Linear operators, eigenvalues, diagonal matrices

    Ah right. The entries along the diagonal of an upper triangular matrix need not be distinct. So then, for example, with T(x_1, x_2, x_3) = (2x_1, 2x_2, 3x_3) we only have two eigenvalues, but with respect to the standard basis we would still have a diagonal matrix, just with repeated elements...
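
    For concreteness, a quick sketch of that example, assuming T acts on F^3 and the matrix is taken with respect to the standard basis:

    \mathcal{M}(T) = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3 \end{pmatrix}

    The diagonal entries are 2, 2, 3, while the set of eigenvalues is just \{2, 3\}: "precisely the eigenvalues" refers to the set of values that appear, possibly with repetition along the diagonal.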
  2. Linear operators, eigenvalues, diagonal matrices

    Ah, what a silly mistake: I forgot to write each vector as a linear combination of the basis elements. But then I'm still confused about the whole eigenvalue issue; I know I must be misinterpreting the theorem. The entries along the diagonal are "precisely" the eigenvalues of T. Doesn't this...
  3. Linear operators, eigenvalues, diagonal matrices

    So I have a couple of questions regarding linear operators, their eigenvalues, and how these relate to their matrices with respect to some basis. For example, I want to show that, given the linear operator T defined by T(x_1,x_2,x_3) = (3x_3, 2x_2, x_1), T can be represented by a diagonal...
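
    A minimal sketch of how that can go, assuming T acts on F^3 with F = \mathbb{R} or \mathbb{C} (the eigenbasis below is one possible choice, not necessarily the one intended in the thread). With respect to the standard basis, \mathcal{M}(T) = \begin{pmatrix} 0 & 0 & 3 \\ 0 & 2 & 0 \\ 1 & 0 & 0 \end{pmatrix}, which is not diagonal, but

    T(\sqrt{3}, 0, 1) = (3, 0, \sqrt{3}) = \sqrt{3}\,(\sqrt{3}, 0, 1)
    T(0, 1, 0) = (0, 2, 0) = 2\,(0, 1, 0)
    T(-\sqrt{3}, 0, 1) = (3, 0, -\sqrt{3}) = -\sqrt{3}\,(-\sqrt{3}, 0, 1)

    so with respect to the basis ((\sqrt{3},0,1),\,(0,1,0),\,(-\sqrt{3},0,1)) the matrix of T is \mathrm{diag}(\sqrt{3},\, 2,\, -\sqrt{3}).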
  4. Solve the equation in Cartesian and polar form: x^3 + 4\sqrt{1+i} = 0

    I don't really see how that helps, as it looks fairly complicated. (1+i)^{\frac{1}{2}} = 2^{\frac{1}{4}} e^{\frac{(8k+1)\pi i}{8}}, which gives us two forms: z_0 = 2^{\frac{1}{4}} \left(\cos \tfrac{\pi}{8} + i\sin \tfrac{\pi}{8}\right) and z_1 = 2^{\frac{1}{4}} \left(\cos \tfrac{9\pi}{8} +...
  5. Solve the equation in Cartesian and polar form: x^3 + 4\sqrt{1+i} = 0

    Homework Statement
    Solve x^3 + 4\sqrt{1+i} = 0 and express the solutions in both Cartesian and polar form.

    Homework Equations
    e^{i\theta} = \cos (\theta) + i \sin (\theta)

    The Attempt at a Solution
    What I did was move the constant term to the right-hand side and squared both sides to get...
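
    A sketch of how the polar form works out, assuming the principal square root of 1+i is used (the other square root, z_1, is handled the same way and gives three further roots):

    1 + i = \sqrt{2}\, e^{i\pi/4} \ \Rightarrow \ \sqrt{1+i} = 2^{1/4} e^{i\pi/8}
    x^3 = -4\sqrt{1+i} = 2^{9/4} e^{i(\pi + \pi/8)} = 2^{9/4} e^{i\,9\pi/8}
    x_k = 2^{3/4} e^{i(3\pi/8 + 2k\pi/3)}, \quad k = 0, 1, 2,

    and the Cartesian forms follow from x_k = 2^{3/4}\left(\cos\theta_k + i\sin\theta_k\right).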
  6. Can P(F) be written as a direct sum of two subspaces?

    Sorry to bring up an old thread. I was just wondering how Office_Shredder's approach works. Specifically, how do I directly show that U_e \cap U_o = \{0\}? Would I suppose a(x) \in U_e \cap U_o and come to the conclusion that the only way this could occur was if the coefficients were...
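
    A sketch of that argument, assuming U_e and U_o denote the even and odd polynomials (p(-x) = p(x) and p(-x) = -p(x), respectively): suppose a(x) \in U_e \cap U_o. Then a(x) = a(-x) and a(x) = -a(-x) for all x, so a(x) = -a(x), i.e. 2a(x) = 0, and hence a = 0. In coefficient terms, evenness forces the odd-degree coefficients to vanish and oddness forces the even-degree coefficients to vanish, so every coefficient is zero, which is exactly the conclusion suggested above.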
  7. Sums of Subspaces: Is Addition Commutative & Associative?

    If U_1, U_2, U_3 are subspaces of V (over the fields R and/or C), is the addition of the subspaces commutative and associative? To me it seems rather trivial, since their sum is simply the set of all possible sums of the elements of U_1, U_2, U_3, and the elements themselves are...
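
    For what it's worth, a one-line version of that observation, using the definition of the sum of two subspaces:

    U_1 + U_2 = \{u_1 + u_2 : u_1 \in U_1,\ u_2 \in U_2\} = \{u_2 + u_1 : u_1 \in U_1,\ u_2 \in U_2\} = U_2 + U_1,

    since addition of vectors in V is commutative; associativity of subspace addition follows in the same way from associativity of vector addition in V.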
  8. Can P(F) be written as a direct sum of two subspaces?

    Oh, thanks a lot! I just noticed they had this proved a few pages later. Ah well, different means to the same end, I suppose.
  9. Can P(F) be written as a direct sum of two subspaces?

    I'm going through Axler's book and was just introduced to the concept of sums of subspaces and direct sums. Here's one of the examples he has. Now, the other examples he had were kind of trivial (such as \mathbb{R}^2 = U \oplus W where U = \{ (x,0) | x \in \mathbb{R} \} and W = \{(0,y) |...
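
    One standard non-trivial decomposition for P(F), included here as a sketch (it matches the even/odd subspaces U_e, U_o that come up elsewhere in this thread, though it need not be the exact example Axler gives): for any p \in P(F),

    p(x) = \frac{p(x) + p(-x)}{2} + \frac{p(x) - p(-x)}{2},

    where the first summand lies in U_e and the second in U_o; combined with U_e \cap U_o = \{0\}, this gives P(F) = U_e \oplus U_o (with F = \mathbb{R} or \mathbb{C}, so dividing by 2 is harmless).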
  10. Vector Space Q: Is Additive Identity Unique?

    Not quite sure if I entirely understand. Wouldn't any element in S then be called an additive identity, since adding it to any vector v would simply give us v?
  11. Vector Space Q: Is Additive Identity Unique?

    Doesn't it, though, for v in S? Let v = (a,b,c) = a + 5b + 3c \in S. Then (a + 5b + 3c) + (0 + 5(3) + 3(-5)) = (a + 5b + 3c) + (0) = a + 5b + 3c = v. So we found an element w = (0, 3, -5) such that v + w = v, which, by definition, would make w an additive identity?
  12. Vector Space Q: Is Additive Identity Unique?

    But wouldn't they also be the additive identity / zero vector, violating the fact that the zero vector of a vector space is unique?
  13. Vector Space Q: Is Additive Identity Unique?

    Just wondering. Suppose we have some plane, any plane, like S = \{ (x_1, x_2, x_3) \in F^{3} \ : \ x_1 + 5x_2 + 3x_3 = 0 \}, where F is either \mathbb{R} or \mathbb{C}. We know that S is a vector space (it passes through the origin). We know that (0,0,0) is the additive identity and it should be unique by...
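
    For reference, a sketch of the standard uniqueness argument, which applies to S just as to any vector space: if 0 and 0' are both additive identities, then 0 = 0 + 0' = 0', so they coincide. (Being an additive identity means v + w = v for every v in the space, not merely for one particular v.)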
  14. Proving the Vector Space Property: cv = 0, v ≠ 0 → c = 0

    I'm considering the problem: given c \in \bold{F}, v \in V, where F is a field and V a vector space, show that cv = 0, v \neq 0 \ \Rightarrow \ c = 0. I've been wrapping my head around this one for a while now, but I can't seem to get it. Proving that cv = 0 and c \neq 0 implies v = 0 is...
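
    A sketch of the usual argument by contradiction, assuming nothing beyond the field and vector space axioms: suppose cv = 0 with v \neq 0, and suppose, for contradiction, that c \neq 0. Then c^{-1} exists in F, and

    v = 1\,v = (c^{-1}c)\,v = c^{-1}(cv) = c^{-1}\,0 = 0,

    contradicting v \neq 0. Hence c = 0.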
  15. Why are two methods for evaluating an integral giving different results?

    Oh whoops! Wow, what a silly mistake. Thanks!