
Proof Please Help

  1. Jun 8, 2008 #1
    1. The problem statement, all variables and given/known data

Prove that for any initial state vector
S0 = [r0 p0 w0]T,
as long as you begin with legitimate proportions (so that r0 + p0 + w0 = 1), after a long run you get the same result as
Sk = [1/4 1/2 1/4]T

    2. Relevant equations

Sk = b1(λ1)^k X1 + b2(λ2)^k X2 + ... + bn(λn)^k Xn

    3. The attempt at a solution

I've tried to show this works by choosing values with r0 + p0 + w0 = 1,
and I've shown that if I do not begin with legitimate proportions it will fail.

Would anyone be able to help show this, maybe using an actual equation like a linear combination?
I'd really appreciate your help :)
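As a sanity check (not a proof), here is a small Python sketch that iterates S(t+1) = A·S(t) with the transition matrix A given later in this thread; the starting proportions below are an arbitrary choice of mine, picked only so that they sum to 1. The iterates approach [1/4 1/2 1/4]T:

```python
# Numerical check (not a proof): repeatedly apply A to a legitimate
# starting vector and watch the result settle at [1/4, 1/2, 1/4].
def mat_vec(A, v):
    # standard matrix-vector product with plain lists
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

A = [[0.5, 0.25, 0.0],
     [0.5, 0.5,  0.5],
     [0.0, 0.25, 0.5]]

s = [0.7, 0.1, 0.2]          # any legitimate proportions: they sum to 1
for _ in range(50):
    s = mat_vec(A, s)

print(s)  # very close to [0.25, 0.5, 0.25]
```

Different legitimate starting vectors give the same limit, which is exactly what the problem asks you to prove.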
     
  2. jcsd
  3. Jun 8, 2008 #2
    Another Proof: subsequent vectors

    1. The problem statement, all variables and given/known data

Prove that for any initial state vector
S0 = [r0 p0 w0]T,
as long as you begin with legitimate proportions (such that r0 + p0 + w0 = 1), the components of subsequent state vectors will also sum to 1.

    2. Relevant equations



    3. The attempt at a solution

Sk+1 = b1(λ1)^(k+1) X1 + b2(λ2)^(k+1) X2 + ... + bn(λn)^(k+1) Xn

I've only been able to show this in a previous proof using actual numbers that are legitimate, and by showing it fails when the numbers used are not legitimate.
Please, I need your help.
     
  4. Jun 8, 2008 #3

    Dick

    User Avatar
    Science Advisor
    Homework Helper

    Your notation is really not clear here. I don't know what problem you are actually trying to solve.
     
  5. Jun 9, 2008 #4
    State vector Proof

    1. The problem statement, all variables and given/known data

1. Prove that for any initial state vector S0 = [r0 p0 w0]T, as long as you begin with legitimate proportions (such that r0 + p0 + w0 = 1), the components of subsequent state vectors will also sum to 1.

    2. Relevant equations
    For example:
if matrix
A =
[ 1/2 1/4 0
  1/2 1/2 1/2
  0   1/4 1/2 ]
    and my eigenvalues are:
[tex]\lambda_1 = 1[/tex]
[tex]\lambda_2 = 1/2[/tex]
[tex]\lambda_3 = 0[/tex]

    and the associated eigenvectors are:
X1 = [1 2 1]T
X2 = [-1 0 1]T
X3 = [1 -2 1]T

and the initial state vector S0 = [1/2 1/4 1/4]T

and in finding a state vector, say for t = 1:
S(t+1) = A·S(t)
so we need S(1), where t = 0:

S(1) =
[ 1/2 1/4 0
  1/2 1/2 1/2
  0   1/4 1/2 ] · [1/2 1/4 1/4]T

= [5/16 1/2 3/16]T


and where t = 1:

S(2) = A·S(1)
= [ 1/2 1/4 0
    1/2 1/2 1/2
    0   1/4 1/2 ] · [5/16 1/2 3/16]T

= [9/32 1/2 7/32]T

So as you can see, these subsequent state vectors sum to 1.



    3. The attempt at a solution
Another way this can be found is by writing it as a linear combination of my eigenvectors.
I also found [b1 b2 b3]T = [1/4 -1/8 1/8]T,
and using
Sk = b1 X1 (λ1)^k + b2 X2 (λ2)^k + b3 X3 (λ3)^k
with k = 2:

S2 = 1/4 [1 2 1]T (1)^2 - 1/8 [-1 0 1]T (1/2)^2 + 0

= [9/32 1/2 7/32]T

I don't know how to prove this generally. Can anyone help, please?
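To double-check the arithmetic above (this only verifies the example, it does not prove the general claim), here is a Python sketch using exact fractions; the matrix, eigenvalues, eigenvectors, and b-coefficients are the ones from this post:

```python
# Exact-arithmetic check of the worked example, using Fractions so the
# component sums come out exactly 1 with no rounding.
from fractions import Fraction as F

A = [[F(1, 2), F(1, 4), F(0)],
     [F(1, 2), F(1, 2), F(1, 2)],
     [F(0),    F(1, 4), F(1, 2)]]

def mat_vec(A, v):
    return [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]

s0 = [F(1, 2), F(1, 4), F(1, 4)]
s1 = mat_vec(A, s0)
s2 = mat_vec(A, s1)
print(s1, sum(s1))   # [5/16, 1/2, 3/16], sum 1
print(s2, sum(s2))   # [9/32, 1/2, 7/32], sum 1

# Same S(2) via the eigenvector linear combination with k = 2
b = [F(1, 4), F(-1, 8), F(1, 8)]
lam = [F(1), F(1, 2), F(0)]
X = [[1, 2, 1], [-1, 0, 1], [1, -2, 1]]
k = 2
s2_eig = [sum(b[i] * lam[i]**k * X[i][j] for i in range(3)) for j in range(3)]
print(s2_eig == s2)  # True
```

Both routes give [9/32 1/2 7/32]T, and both state vectors sum to exactly 1.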
     
    Last edited: Jun 9, 2008
  6. Jun 9, 2008 #5

    CompuChip

    User Avatar
    Science Advisor
    Homework Helper

You are on the right track. You can write any initial state vector as a linear combination of the eigenvectors:
[tex]\vec S(0) = \lambda \vec v_1 + \mu \vec v_2 + \nu \vec v_3[/tex].
At a later time t, the state vector is
[tex]\vec S(t) = A^t \vec S(0)[/tex].
Now use that matrix multiplication is linear and that you know [tex]A \vec v_i[/tex] for the eigenvectors [itex]\vec v_i[/itex]. I didn't fully write out the proof, but it should work.
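Spelling out the step this hint points at (a sketch only, keeping CompuChip's coefficient names and plugging in the example's eigenvalues 1, 1/2, 0): since [itex]A \vec v_i = \lambda_i \vec v_i[/itex], applying [itex]A^t[/itex] term by term gives

[tex]\vec S(t) = A^t \vec S(0) = \lambda (\lambda_1)^t \vec v_1 + \mu (\lambda_2)^t \vec v_2 + \nu (\lambda_3)^t \vec v_3 = \lambda (1)^t \vec v_1 + \mu (1/2)^t \vec v_2 + \nu (0)^t \vec v_3[/tex]

so as [itex]t \to \infty[/itex] the last two terms die off and [itex]\vec S(t) \to \lambda \vec v_1[/itex], a multiple of the eigenvector for eigenvalue 1.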
     
  8. Jun 9, 2008 #7

    CompuChip

    User Avatar
    Science Advisor
    Homework Helper

Hmm, where have I seen this question before?
Oh wait, it was here and here.
I doubt posting it three times will get you three times as many answers (probably fewer, actually, as a lot of people think it's kind of rude).
     
  9. Jun 9, 2008 #8
Sorry, I couldn't find the original place I posted it, and it wasn't under my profile; I needed to edit the question.
But no worries, I have the answer solved. If anyone wants it, just message me.
     
  10. Jun 10, 2008 #9

    cristo

    User Avatar
    Staff Emeritus
    Science Advisor

    Several threads on the same topic merged.
     
  11. Jun 10, 2008 #10
If the sum of the components of each column vector of a matrix is 1, then the sum of the components of any initial vector will be preserved. It is important to note in the following proof that the dot product, when taken with a vector whose components are all ones (denoted [itex]\vec 1[/itex]), is equivalent to the sum of the components of the other vector. You may be able to adapt this to your purposes.

We want to show [itex](A \vec x) \cdot \vec 1 = \vec x \cdot \vec 1[/itex]. Writing [itex]A \vec x[/itex] in terms of the columns [itex]\vec C_i[/itex] of A:

[itex](A \vec x) \cdot \vec 1 = (x_1\vec C_1 + ... + x_n\vec C_n) \cdot \vec 1 = (\sum{x_i\vec C_i}) \cdot \vec 1 = \sum{x_i(\vec C_i \cdot \vec 1)}[/itex]

and since each column sums to 1, [itex]\vec C_i \cdot \vec 1 = 1[/itex], so this equals [itex]\sum{x_i} = \vec x \cdot \vec 1[/itex].
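A quick numerical illustration of this argument (a sketch, using the matrix A from earlier in the thread): because every column of A sums to 1, multiplying by A preserves the component sum of any vector, not just a probability vector.

```python
# Column sums of A are all 1, so A preserves the component sum of
# ANY vector x, legitimate proportions or not.
A = [[0.5, 0.25, 0.0],
     [0.5, 0.5,  0.5],
     [0.0, 0.25, 0.5]]

col_sums = [sum(A[i][j] for i in range(3)) for j in range(3)]
print(col_sums)  # [1.0, 1.0, 1.0]

x = [3.0, -1.5, 0.25]                               # arbitrary vector
Ax = [sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]
print(sum(Ax), sum(x))  # both 1.75
```

With a legitimate state vector the preserved sum is 1, which is exactly the claim in the second problem above.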
     