Proving Steady State of Initial Vector S0

If we now consider the vector equation [itex]\vec x(t+1) = A\vec x(t)[/itex], then we can see that [itex]\vec x(t) = A^t \vec x(0)[/itex]. If we take the dot product with [itex]\vec 1[/itex] we get [itex]\vec 1 \cdot \vec x(t) = \vec 1 \cdot A^t \vec x(0)[/itex], where we have used the fact that for any vector [itex]\vec y[/itex], [itex]\vec 1 \cdot \vec y = \vec y \cdot \vec 1[/itex]. Using the fact that [itex]\vec 1 \cdot \vec x(t) = \sum{x_i(t)}[/itex], as discussed above, we then find that [itex]\sum{x_i(t)} = \sum{x_i(0)}[/itex], since each column of [itex]A[/itex] sums to 1.
  • #1

Homework Statement



Prove that for any initial state vector
S0 = [r0 p0 w0]T
as long as you begin with legitimate proportions (so that r0 + p0 + w0 = 1), after a long run you get the same result as
Sk = [1/4 1/2 1/4]T

Homework Equations



[tex]S_k = b_1 \lambda_1^k \vec X_1 + b_2 \lambda_2^k \vec X_2 + \dots + b_n \lambda_n^k \vec X_n[/tex]
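Not part of the original post, but a quick numerical sketch of the claim may help: the snippet below assumes the 3x3 matrix A given in post #4 further down and simply iterates S(t+1) = A S(t) from a randomly chosen legitimate starting vector; the iterates settle on [1/4, 1/2, 1/4].

[code]
import numpy as np

# Transition matrix from post #4 below (assumed to be the one behind this problem).
A = np.array([[0.5, 0.25, 0.0],
              [0.5, 0.5,  0.5],
              [0.0, 0.25, 0.5]])

# Any legitimate initial state: nonnegative entries that sum to 1.
rng = np.random.default_rng(0)
S = rng.random(3)
S /= S.sum()

# Iterate S(t+1) = A S(t).
for _ in range(50):
    S = A @ S

print(S)        # approximately [0.25, 0.5, 0.25]
print(S.sum())  # the components still sum to 1
[/code]

Changing the seed (and hence S0) leaves the limit unchanged, which is exactly the statement to be proved.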

The Attempt at a Solution



I've tried to go about this by choosing specific values with r0 + p0 + w0 = 1, and I've shown that if I do not begin with legitimate proportions it fails.

Would anyone be able to help show this, maybe using an actual equation like a linear combination?
I'd really appreciate your help :)
 
  • #2
Another Proof: subsequent vectors

Homework Statement



Prove that for any initial state vector
S0 = [r0 p0 w0]T
as long as you begin with legitimate proportions (such that r0 + p0 + w0 = 1), the components of subsequent state vectors will also sum to 1.

Homework Equations





The Attempt at a Solution



[tex]S_{k+1} = b_1 \lambda_1^{k+1} \vec X_1 + b_2 \lambda_2^{k+1} \vec X_2 + \dots + b_n \lambda_n^{k+1} \vec X_n[/tex]

I've only been able to show this in a previous attempt using actual numbers that are legitimate, and by showing it fails when the numbers used are not legitimate.
Please, I need your help.
 
  • #3
Your notation is really not clear here. I don't know what problem you are actually trying to solve.
 
  • #4
State vector Proof

Homework Statement



1. Prove that for any initial state vector S0 = [r0 p0 w0]T, as long as you begin with legitimate proportions (such that r0 + p0 + w0 = 1), the components of subsequent state vectors will also sum to 1.

Homework Equations


For example:
if the matrix is
[tex]A = \begin{bmatrix} 1/2 & 1/4 & 0 \\ 1/2 & 1/2 & 1/2 \\ 0 & 1/4 & 1/2 \end{bmatrix}[/tex]
and my eigenvalues are:
[tex]\lambda_1 = 1[/tex]
[tex]\lambda_2 = 1/2[/tex]
[tex]\lambda_3 = 0[/tex]

and the associated eigenvectors are:
X1 = [1 2 1]T
X2 = [-1 0 1]T
X3 = [1 -2 1]T

and the initial state vector S0 = [1/2 1/4 1/4]T
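
As an aside (my addition, not in the original post), the eigen-data above can be double-checked numerically; note that numpy returns unit-length eigenvectors, so they show up as scalar multiples of the X_i listed above, possibly in a different order.

[code]
import numpy as np

A = np.array([[0.5, 0.25, 0.0],
              [0.5, 0.5,  0.5],
              [0.0, 0.25, 0.5]])

vals, vecs = np.linalg.eig(A)
print(np.round(vals, 10))   # 1.0, 0.5, 0.0 (order may differ)

# Rescale each eigenvector so its largest-magnitude entry becomes 1,
# making it easy to recognise multiples of [1 2 1]T, [-1 0 1]T, [1 -2 1]T.
for i in range(3):
    v = vecs[:, i]
    print(np.round(v / v[np.argmax(np.abs(v))], 10))
[/code]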

and to find a later state vector we use
S(t+1) = A*S(t),
so for t = 0 we need S(1):

[tex]S(1) = \begin{bmatrix} 1/2 & 1/4 & 0 \\ 1/2 & 1/2 & 1/2 \\ 0 & 1/4 & 1/2 \end{bmatrix} \begin{bmatrix} 1/2 \\ 1/4 \\ 1/4 \end{bmatrix} = \begin{bmatrix} 5/16 \\ 1/2 \\ 3/16 \end{bmatrix}[/tex]


and for t = 1:

[tex]S(2) = A\,S(1) = \begin{bmatrix} 1/2 & 1/4 & 0 \\ 1/2 & 1/2 & 1/2 \\ 0 & 1/4 & 1/2 \end{bmatrix} \begin{bmatrix} 5/16 \\ 1/2 \\ 3/16 \end{bmatrix} = \begin{bmatrix} 9/32 \\ 1/2 \\ 7/32 \end{bmatrix}[/tex]

So as you can see, the components of these subsequent state vectors also sum to 1.
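
A minimal numerical cross-check of this (my addition), using the matrix and initial vector above: since each column of A sums to 1, multiplying by A preserves the component sum.

[code]
import numpy as np

A = np.array([[0.5, 0.25, 0.0],
              [0.5, 0.5,  0.5],
              [0.0, 0.25, 0.5]])
S0 = np.array([0.5, 0.25, 0.25])

# Each column of A sums to 1, which is the key property.
print(A.sum(axis=0))        # [1. 1. 1.]

S1 = A @ S0                 # [5/16, 1/2, 3/16]
S2 = A @ S1                 # [9/32, 1/2, 7/32]
print(S1, S1.sum())         # sum is 1
print(S2, S2.sum())         # sum is 1
[/code]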



The Attempt at a Solution


Another way S(2) can be found is by writing it as a linear combination of my eigenvectors.
I also found [b1 b2 b3]T = [1/4 -1/8 1/8]T, and using
[tex]S_k = b_1 \lambda_1^k \vec X_1 + b_2 \lambda_2^k \vec X_2 + b_3 \lambda_3^k \vec X_3[/tex]
with k = 2:

[tex]S_2 = \tfrac{1}{4}(1)^2 \begin{bmatrix} 1 \\ 2 \\ 1 \end{bmatrix} - \tfrac{1}{8}\left(\tfrac{1}{2}\right)^2 \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix} + 0 = \begin{bmatrix} 9/32 \\ 1/2 \\ 7/32 \end{bmatrix}[/tex]
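
A small sketch (my addition) of the same linear-combination step done numerically: solve for the coefficients b from the eigenvector matrix, then rebuild S(2); the names V and b are just illustrative.

[code]
import numpy as np

A = np.array([[0.5, 0.25, 0.0],
              [0.5, 0.5,  0.5],
              [0.0, 0.25, 0.5]])
S0 = np.array([0.5, 0.25, 0.25])

# Eigenvectors from the post, stored as the columns of V.
V = np.array([[1, -1,  1],
              [2,  0, -2],
              [1,  1,  1]], dtype=float)
lam = np.array([1.0, 0.5, 0.0])

# Coefficients b such that S0 = b1*X1 + b2*X2 + b3*X3.
b = np.linalg.solve(V, S0)
print(b)                              # [0.25, -0.125, 0.125]

# Rebuild S(2) from the eigen-expansion and compare with A @ A @ S0.
S2 = V @ (b * lam**2)
print(S2)                             # [9/32, 1/2, 7/32]
print(np.allclose(S2, A @ A @ S0))    # True
[/code]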

I don't know how to prove this in general. Can anyone help, please?
 
  • #5
You are on the right track: you can write any initial state vector as a linear combination of the eigenvectors,
[tex]\vec S(0) = \lambda \vec v_1 + \mu \vec v_2 + \nu \vec v_3.[/tex]
At a later time t the state vector is
[tex]\vec S(t) = A^t \vec S(0).[/tex]
Now use the fact that matrix multiplication is linear and that you know [itex]A \vec v_i[/itex] for the eigenvectors [itex]\vec v_i[/itex]. I didn't fully write out the proof, but it should work.
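
To sketch how that argument could be finished (my own write-up, where [itex]\lambda, \mu, \nu[/itex] are the expansion coefficients above, not eigenvalues, and using the eigenvalues 1, 1/2, 0 and eigenvector [itex]\vec v_1 = [1\ 2\ 1]^T[/itex] from post #4):

[tex]\vec S(t) = A^t \vec S(0) = \lambda (1)^t \vec v_1 + \mu \left(\tfrac{1}{2}\right)^t \vec v_2 + \nu (0)^t \vec v_3 \;\longrightarrow\; \lambda \vec v_1 \quad \text{as } t \to \infty.[/tex]

Since the components of [itex]\vec S(t)[/itex] sum to 1 at every step (see post #10 below) and the components of [itex]\vec v_1[/itex] sum to 4, the limit forces [itex]\lambda = 1/4[/itex], i.e. [itex]\lambda \vec v_1 = [1/4\;\; 1/2\;\; 1/4]^T[/itex], regardless of the legitimate starting vector.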
 
  • #6
Also see https://www.physicsforums.com/showthread.php?t=239401.
 
  • #7
Hmm, where have I seen this question before?
Oh wait, it was https://www.physicsforums.com/showthread.php?t=239384 and https://www.physicsforums.com/showthread.php?t=239401.
I doubt posting it three times will get you three times as many answers (probably fewer, as a lot of people think it's kinda rude).
 
  • #8
Sorry, I couldn't find the original place I posted it, and it wasn't under my profile; I needed to edit the question.
But no worries, I have the answer solved. If anyone wants it, just message me.
 
  • #9
Several threads on the same topic merged.
 
  • #10
If the sum of the components of each column vector of a matrix is 1, then the sum of the components of any initial vector will be preserved. It is important to note in the following proof that the dot product with a vector whose components are all one (denoted [itex]\vec 1[/itex]) is equivalent to the sum of all the components of the other vector. You may be able to adapt this to your purposes. Read the chain from the top down: the claim in the first line reduces, step by step, to the condition in the last line, which holds by assumption.

[itex](A \vec x) \cdot \vec 1 = \vec x \cdot \vec 1[/itex]

[itex](x_1\vec C_1 + ... + x_n\vec C_n) \cdot \vec 1 = x_1 + ... + x_n[/itex]

[itex](\sum{x_i\vec C_i}) \cdot \vec 1 = \sum{x_i}[/itex]

[itex]\sum{x_i(\vec C_i \cdot \vec 1)} = \sum{x_i}[/itex]

[itex]\vec C_i \cdot \vec 1 = 1[/itex]
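
A one-line follow-up (my addition) on how this identity gives the statement in post #2: applying it at every step of the iteration,

[tex]\vec 1 \cdot \vec S(k+1) = \vec 1 \cdot \bigl(A \vec S(k)\bigr) = \vec 1 \cdot \vec S(k) = \dots = \vec 1 \cdot \vec S(0) = 1,[/tex]

so the components of every subsequent state vector also sum to 1.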
 
