
Proof Please Help

  • Thread starter krisrai
  • Start date
  • #1

Homework Statement



Prove that for any initial state vector
S0 = [r0 p0 w0]T,
as long as you begin with legitimate proportions (so that r0 + p0 + w0 = 1), after a long run you get the same result:
Sk = [1/4 1/2 1/4]T

Homework Equations



[tex]S_k = b_1\lambda_1^k X_1 + b_2\lambda_2^k X_2 + \dots + b_n\lambda_n^k X_n[/tex]

The Attempt at a Solution



I've tried to go about this by choosing r0 + p0 + w0 = 1 and showing that it works,
and I've shown that if I do not begin with legitimate proportions it fails.

Would anyone be able to help show this, maybe using an actual equation like a linear combination?
I'd really appreciate your help :)
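Not a proof, just a numerical sketch of the claim (assuming the transition matrix A that is quoted later in this thread, in post #4): whatever legitimate starting proportions you pick, repeated multiplication drives the state toward [1/4 1/2 1/4]T.

[code]
# Sketch only: iterate S_{k+1} = A S_k and watch where it settles.
# The matrix A is taken from post #4 below; here it is just an assumption.
import numpy as np

A = np.array([[0.5, 0.25, 0.0],
              [0.5, 0.5,  0.5],
              [0.0, 0.25, 0.5]])

S = np.array([0.7, 0.1, 0.2])   # any legitimate start: components sum to 1
for _ in range(50):
    S = A @ S

print(S)                                    # approx [0.25, 0.5, 0.25]
print(np.allclose(S, [0.25, 0.5, 0.25]))    # True
[/code]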
 

Answers and Replies

  • #2
Another Proof: subsequent vectors

Homework Statement



Prove that for any initial state vector
S0 = [r0 p0 w0]T,
as long as you begin with legitimate proportions (such that r0 + p0 + w0 = 1), the components of subsequent state vectors will also sum to 1.

Homework Equations





The Attempt at a Solution



[tex]S_{k+1} = b_1\lambda_1^{k+1} X_1 + b_2\lambda_2^{k+1} X_2 + \dots + b_n\lambda_n^{k+1} X_n[/tex]

I've only been able to show this, as in the previous proof, by using actual numbers that are legitimate, and showing that it fails when the numbers used are not legitimate.
Please, I need your help.
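Again only a numerical sanity check, not the proof (same assumed matrix A as in post #4): pick any legitimate S0 and the components of every subsequent state vector still sum to 1.

[code]
# Sanity check: the component sum stays at 1 after every step.
import numpy as np

A = np.array([[0.5, 0.25, 0.0],
              [0.5, 0.5,  0.5],
              [0.0, 0.25, 0.5]])

rng = np.random.default_rng(0)
S = rng.random(3)
S = S / S.sum()              # make the proportions legitimate: sum = 1

for k in range(10):
    S = A @ S
    print(k + 1, S.sum())    # 1.0 (up to rounding) every time
[/code]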
 
  • #3
Dick
Science Advisor
Homework Helper
Your notation is really not clear here. I don't know what problem you are actually trying to solve.
 
  • #4
State vector Proof

Homework Statement



1. Prove that for any initial state vector S0 = [r0 p0 w0]T, as long as you begin with legitimate proportions (such that r0 + p0 + w0 = 1), the components of subsequent state vectors will also sum to 1.

Homework Equations


For example, if the matrix is

[tex]A = \begin{bmatrix} \tfrac12 & \tfrac14 & 0 \\ \tfrac12 & \tfrac12 & \tfrac12 \\ 0 & \tfrac14 & \tfrac12 \end{bmatrix}[/tex]

and my eigenvalues are

[tex]\lambda_1 = 1, \qquad \lambda_2 = \tfrac12, \qquad \lambda_3 = 0[/tex]

with associated eigenvectors

X1 = [1 2 1]T
X2 = [-1 0 1]T
X3 = [1 -2 1]T

and the initial state vector is S0 = [1/2 1/4 1/4]T.

To find a state vector, say at t = 1, I use S(t+1) = A S(t), so with t = 0:

[tex]S(1) = A\,S(0) = \begin{bmatrix} \tfrac12 & \tfrac14 & 0 \\ \tfrac12 & \tfrac12 & \tfrac12 \\ 0 & \tfrac14 & \tfrac12 \end{bmatrix}\begin{bmatrix} \tfrac12 \\ \tfrac14 \\ \tfrac14 \end{bmatrix} = \begin{bmatrix} \tfrac{5}{16} \\ \tfrac12 \\ \tfrac{3}{16} \end{bmatrix}[/tex]


Then with t = 1:

[tex]S(2) = A\,S(1) = \begin{bmatrix} \tfrac12 & \tfrac14 & 0 \\ \tfrac12 & \tfrac12 & \tfrac12 \\ 0 & \tfrac14 & \tfrac12 \end{bmatrix}\begin{bmatrix} \tfrac{5}{16} \\ \tfrac12 \\ \tfrac{3}{16} \end{bmatrix} = \begin{bmatrix} \tfrac{9}{32} \\ \tfrac12 \\ \tfrac{7}{32} \end{bmatrix}[/tex]

So, as you can see, these subsequent state vectors sum to 1.
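For what it's worth, both hand computations above check out when redone in exact fractions (a quick sketch, nothing more):

[code]
# Verify S(1) = A S(0) and S(2) = A S(1) from the worked example, exactly.
from fractions import Fraction as F

A  = [[F(1, 2), F(1, 4), F(0)],
      [F(1, 2), F(1, 2), F(1, 2)],
      [F(0),    F(1, 4), F(1, 2)]]
S0 = [F(1, 2), F(1, 4), F(1, 4)]

def mul(M, v):
    # ordinary matrix-vector product, kept in exact fractions
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

S1 = mul(A, S0)
S2 = mul(A, S1)
print([str(x) for x in S1])    # ['5/16', '1/2', '3/16']
print([str(x) for x in S2])    # ['9/32', '1/2', '7/32']
print(sum(S1), sum(S2))        # 1 1
[/code]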



The Attempt at a Solution


Another way this can be found is to write S0 as a linear combination of my eigenvectors.
I also found [b1 b2 b3]T = [1/4 -1/8 1/8]T, and using

[tex]S_k = b_1 X_1 \lambda_1^k + b_2 X_2 \lambda_2^k + b_3 X_3 \lambda_3^k[/tex]

with k = 2:

[tex]S_2 = \tfrac14 \begin{bmatrix}1\\2\\1\end{bmatrix}(1)^2 - \tfrac18 \begin{bmatrix}-1\\0\\1\end{bmatrix}\left(\tfrac12\right)^2 + 0 = \begin{bmatrix}\tfrac{9}{32}\\ \tfrac12\\ \tfrac{7}{32}\end{bmatrix}[/tex]
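The b coefficients and the eigen-expansion for S2 also check out numerically; one way to recompute them (a sketch only) is to solve X b = S0, where X has the eigenvectors as its columns:

[code]
# Recompute b from X b = S0, then rebuild S(2) from the eigen-expansion.
import numpy as np

X    = np.array([[1, -1,  1],
                 [2,  0, -2],
                 [1,  1,  1]], dtype=float)   # X1, X2, X3 as columns
lams = np.array([1.0, 0.5, 0.0])              # matching eigenvalues
S0   = np.array([0.5, 0.25, 0.25])

b = np.linalg.solve(X, S0)
print(b)                                      # [ 0.25  -0.125  0.125]

S2 = sum(b[i] * lams[i] ** 2 * X[:, i] for i in range(3))
print(S2)                                     # [0.28125 0.5 0.21875] = [9/32, 1/2, 7/32]
[/code]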

I don't know how to prove this generally. Can anyone help, please?
 
  • #5
CompuChip
Science Advisor
Homework Helper
You are on the right track. You can write any initial state vector as a linear combination of the eigenvectors:
[tex]\vec S(0) = b_1 \vec v_1 + b_2 \vec v_2 + b_3 \vec v_3[/tex].
Now at a later time t the state vector is
[tex]\vec S(t) = A^t \vec S(0)[/tex]
Now use that matrix multiplication is linear and that you know [itex]A \vec v_i[/itex] for the eigenvectors [itex]\vec v_i[/itex]. I didn't fully write out the proof, but it should work.
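Spelling that hint out a little (a sketch only, using the b_i coefficients and the eigenvalues from the example above, where [itex]\lambda_1 = 1[/itex] and [itex]|\lambda_2|, |\lambda_3| < 1[/itex]):

[tex]\vec S(t) = A^t \vec S(0) = b_1 \lambda_1^t \vec v_1 + b_2 \lambda_2^t \vec v_2 + b_3 \lambda_3^t \vec v_3 \;\longrightarrow\; b_1 \vec v_1 \quad (t \to \infty).[/tex]

Since the components of every state vector sum to 1 (that is the other part of the problem) and [itex]\vec v_1 = [1\ 2\ 1]^T[/itex], the limit must satisfy b1(1 + 2 + 1) = 1, so b1 = 1/4 and the limit is exactly [1/4 1/2 1/4]T, whatever legitimate starting vector you choose.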
 
  • #7
CompuChip
Science Advisor
Homework Helper
Hmm, where have I seen this question before?
Oh wait, it was here and here.
I doubt posting it three times will get you three times as many answers (probably fewer, as a lot of people think it's kind of rude).
 
  • #8
Sorry, I couldn't find the original place I posted it, and it wasn't under my profile; I needed to edit the question.
But no worries, I have the answer solved. If anyone wants it, just message me.
 
  • #9
cristo
Staff Emeritus
Science Advisor
Several threads on the same topic merged.
 
  • #10
If the sum of the components of each column vector of a matrix is 1, then the sum of the components of any initial vector will be preserved. It is important to note in the following proof that the dot product with a vector whose components are all one (denoted [itex]\vec 1[/itex]) equals the sum of the components of the other vector, and that [itex]\vec C_i[/itex] denotes the i-th column of A. You may be able to adapt this to your purposes.

[itex](A \vec x) \cdot \vec 1 = \vec x \cdot \vec 1[/itex]

[itex](x_1\vec C_1 + ... + x_n\vec C_n) \cdot \vec 1 = x_1 + ... + x_n[/itex]

[itex](\sum{x_i\vec C_i}) \cdot \vec 1 = \sum{x_i}[/itex]

[itex]\sum{x_i(\vec C_i \cdot \vec 1)} = \sum{x_i}[/itex]

[itex]\vec C_i \cdot \vec 1 = 1[/itex]
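A quick sanity check of that identity (again assuming the matrix A from post #4, whose columns each sum to 1):

[code]
# Check (A x) . 1 = x . 1 for a matrix whose columns each sum to 1.
import numpy as np

A = np.array([[0.5, 0.25, 0.0],
              [0.5, 0.5,  0.5],
              [0.0, 0.25, 0.5]])
ones = np.ones(3)

rng = np.random.default_rng(1)
x = rng.random(3)                              # arbitrary vector, need not sum to 1
print(np.isclose((A @ x) @ ones, x @ ones))    # True
[/code]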
 
