Proving Steady State of Initial Vector S0

SUMMARY

The discussion centers on proving that for any initial state vector S_0 = [r_0, p_0, w_0]^T with legitimate proportions (r_0 + p_0 + w_0 = 1), the subsequent state vectors S_k converge to [1/4, 1/2, 1/4]^T in the long run. The key equation is S_k = b_1 (λ_1)^k X_1 + b_2 (λ_2)^k X_2 + ... + b_n (λ_n)^k X_n, and the participants emphasize the importance of writing the initial vector as a linear combination of eigenvectors to demonstrate this convergence. The proof also highlights that the sum of the components of any initial vector is preserved if the sum of the components of each column of the matrix is 1.

PREREQUISITES
  • Understanding of state vectors and their properties
  • Familiarity with eigenvalues and eigenvectors
  • Knowledge of matrix multiplication and linear combinations
  • Basic grasp of linear algebra concepts
NEXT STEPS
  • Study the properties of eigenvectors and eigenvalues in detail
  • Learn about matrix multiplication and its implications in state transitions
  • Explore proofs involving linear combinations in linear algebra
  • Investigate the application of Markov chains in state vector analysis
USEFUL FOR

Students and professionals in mathematics, particularly those focusing on linear algebra, eigenvalue problems, and state transition models. This discussion is also beneficial for anyone studying Markov processes or related mathematical proofs.
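Before diving into the thread, the claimed convergence is easy to check numerically. Below is a minimal sketch, assuming the 3×3 transition matrix given in the worked example later in the thread; exact fractions are used so no rounding noise creeps in:

```python
from fractions import Fraction as F

# Transition matrix from the worked example later in the thread
# (each column sums to 1, so total probability is preserved).
A = [[F(1, 2), F(1, 4), F(0)],
     [F(1, 2), F(1, 2), F(1, 2)],
     [F(0),    F(1, 4), F(1, 2)]]

def step(A, s):
    """One transition: S_{k+1} = A * S_k."""
    return [sum(A[i][j] * s[j] for j in range(3)) for i in range(3)]

# Any legitimate initial proportions (summing to 1) will do.
s = [F(1, 2), F(1, 4), F(1, 4)]
for _ in range(30):
    s = step(A, s)

print([float(x) for x in s])  # approaches [0.25, 0.5, 0.25]
print(sum(s))                 # stays exactly 1 at every step
```

Changing the starting vector to any other triple summing to 1 gives the same limit, which is exactly what the homework asks to prove in general.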

krisrai

Homework Statement



Prove that for any initial state vector
S_0 = [r_0, p_0, w_0]^T,
as long as you begin with legitimate proportions (so that r_0 + p_0 + w_0 = 1), after a long run you get the same result:
S_k = [1/4, 1/2, 1/4]^T

Homework Equations



S_k = b_1 (λ_1)^k X_1 + b_2 (λ_2)^k X_2 + ... + b_n (λ_n)^k X_n

The Attempt at a Solution



I've tried to go about this by choosing r_0 + p_0 + w_0 = 1 and showing that it works,
and I've shown that if I do not begin with legitimate proportions it will fail.

Would anyone be able to help show this, maybe using an actual equation like a linear combination?
I'd really appreciate your help :)
 
Another Proof: subsequent vectors

Homework Statement



Prove that for any initial state vector
S_0 = [r_0, p_0, w_0]^T,
as long as you begin with legitimate proportions (such that r_0 + p_0 + w_0 = 1), the components of subsequent state vectors will also sum to 1.

Homework Equations





The Attempt at a Solution



S_{k+1} = b_1 (λ_1)^{k+1} X_1 + b_2 (λ_2)^{k+1} X_2 + ... + b_n (λ_n)^{k+1} X_n

I've only been able to show this in a previous proof using actual numbers that are legitimate, and to show it failing when the numbers used are not legitimate.
Please, I need your help.
 
Your notation is really not clear here. I don't know what problem you are actually trying to solve.
 
State vector Proof

Homework Statement



1. Prove that for any initial state vector S_0 = [r_0 p_0 w_0]^T, as long as you begin with legitimate proportions (such that r_0 + p_0 + w_0 = 1), the components of subsequent state vectors will also sum to 1.

Homework Equations


For example:
if the matrix is
A = [ 1/2  1/4  0
      1/2  1/2  1/2
      0    1/4  1/2 ]
and my eigenvalues are:
λ_1 = 1
λ_2 = 1/2
λ_3 = 0

and the associated eigenvectors are:
X_1 = [1 2 1]^T
X_2 = [-1 0 1]^T
X_3 = [1 -2 1]^T

and the initial state vector is S_0 = [1/2 1/4 1/4]^T

and in finding a state vector, say for t = 1:
S(t+1) = A * S(t)
then we need S(1), where t = 0:

S(1) =
[ 1/2  1/4  0
  1/2  1/2  1/2
  0    1/4  1/2 ] * [1/2 1/4 1/4]^T

= [5/16 1/2 3/16]^T

and where t = 1:

S(2) = A * S(1)
= [ 1/2  1/4  0
    1/2  1/2  1/2
    0    1/4  1/2 ] * [5/16 1/2 3/16]^T

= [9/32 1/2 7/32]^T

So as you can see, these subsequent state vectors sum to 1.
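For reference, the two hand computations above can be reproduced exactly with Python's fractions module (a sketch, not part of the original post):

```python
from fractions import Fraction as F

# The transition matrix A from the example above.
A = [[F(1, 2), F(1, 4), F(0)],
     [F(1, 2), F(1, 2), F(1, 2)],
     [F(0),    F(1, 4), F(1, 2)]]

def matvec(A, s):
    """Compute the matrix-vector product A * s."""
    return [sum(A[i][j] * s[j] for j in range(3)) for i in range(3)]

s0 = [F(1, 2), F(1, 4), F(1, 4)]
s1 = matvec(A, s0)   # [5/16, 1/2, 3/16]
s2 = matvec(A, s1)   # [9/32, 1/2, 7/32]
print(s1, sum(s1))   # components sum to 1
print(s2, sum(s2))   # components sum to 1
```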

The Attempt at a Solution


Another way this can be found is by writing it as a linear combination of my eigenvectors.
I also found [b_1 b_2 b_3]^T = [1/4 -1/8 1/8]^T,
and using
S_k = b_1 X_1 (λ_1)^k + b_2 X_2 (λ_2)^k + b_3 X_3 (λ_3)^k

S_2 = 1/4 [1 2 1]^T (1)^2 - 1/8 [-1 0 1]^T (1/2)^2 + 0

= [9/32 1/2 7/32]^T
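The same linear-combination formula can be evaluated programmatically. This sketch plugs in the eigenvalues, eigenvectors, and coefficients from the post above:

```python
from fractions import Fraction as F

lams = [F(1), F(1, 2), F(0)]          # eigenvalues lambda_1..lambda_3
X = [[F(1), F(2), F(1)],              # eigenvector X_1
     [F(-1), F(0), F(1)],             # eigenvector X_2
     [F(1), F(-2), F(1)]]             # eigenvector X_3
b = [F(1, 4), F(-1, 8), F(1, 8)]      # coordinates of S_0 in the eigenbasis

def state(k):
    """S_k = b_1 (lambda_1)^k X_1 + b_2 (lambda_2)^k X_2 + b_3 (lambda_3)^k X_3."""
    return [sum(b[i] * lams[i]**k * X[i][j] for i in range(3)) for j in range(3)]

print(state(0))  # recovers S_0 = [1/2, 1/4, 1/4]
print(state(2))  # [9/32, 1/2, 7/32], matching A^2 * S_0
```

As k grows, the (1/2)^k and 0^k terms die off and only the λ = 1 term survives, leaving b_1 X_1 = 1/4 [1 2 1]^T = [1/4, 1/2, 1/4]^T, which is the claimed steady state.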

I don't know how to prove this generally. Can anyone help, please?
 
You are on the right track. You can write any initial state vector as a linear combination of the eigenvectors:
\vec S(0) = \lambda \vec v_1 + \mu \vec v_2 + \nu \vec v_3.
Now at a later time t the state vector is
\vec S(t) = A^t \vec S(0).
Now use that matrix multiplication is linear and that you know A \vec v_i for the eigenvectors \vec v_i. I didn't fully write out the proof, but it should work.
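A quick check of this hint, assuming the matrix and eigenvectors from the example above: each eigenvector satisfies A v_i = λ_i v_i, so by linearity A^t only scales each eigen-component by λ_i^t.

```python
from fractions import Fraction as F

A = [[F(1, 2), F(1, 4), F(0)],
     [F(1, 2), F(1, 2), F(1, 2)],
     [F(0),    F(1, 4), F(1, 2)]]

def matvec(A, v):
    """Compute the matrix-vector product A * v."""
    return [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]

eigenpairs = [
    (F(1),    [F(1), F(2), F(1)]),    # lambda_1 = 1
    (F(1, 2), [F(-1), F(0), F(1)]),   # lambda_2 = 1/2
    (F(0),    [F(1), F(-2), F(1)]),   # lambda_3 = 0
]

# Verify A v_i = lambda_i v_i; hence A^t v_i = lambda_i^t v_i.
for lam, v in eigenpairs:
    assert matvec(A, v) == [lam * c for c in v]
print("all eigenpairs verified")
```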
 
Also see https://www.physicsforums.com/showthread.php?t=239401.
 
Hmm, where have I seen this question before?
Oh wait, it was https://www.physicsforums.com/showthread.php?t=239384 and https://www.physicsforums.com/showthread.php?t=239401.
I doubt posting it three times will get you three times as many answers (probably even three times less, as a lot of people think it's kinda rude).
 
Sorry, I couldn't find the original place I posted it, and it wasn't under my profile; I needed to edit the question.
But no worries, I have the answer solved; if anyone wants it, just message me.
 
Several threads on the same topic merged.
 
If the sum of the components of each column vector of a matrix is 1, then the sum of the components of any initial vector will be preserved. It is important to note in the following proof that the dot product, when taken with a vector whose components are all one (denoted \vec 1), is equivalent to the sum of all the components of the other vector. You may be able to adapt this to your purposes.

(A \vec x) \cdot \vec 1 = \vec x \cdot \vec 1

(x_1\vec C_1 + ... + x_n\vec C_n) \cdot \vec 1 = x_1 + ... + x_n

(\sum{x_i\vec C_i}) \cdot \vec 1 = \sum{x_i}

\sum{x_i(\vec C_i \cdot \vec 1)} = \sum{x_i}

\vec C_i \cdot \vec 1 = 1
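As a concrete check of this argument, here is a sketch using the matrix from the example above, whose columns each sum to 1. Note that the conclusion holds for any vector x, not just probability vectors:

```python
from fractions import Fraction as F

A = [[F(1, 2), F(1, 4), F(0)],
     [F(1, 2), F(1, 2), F(1, 2)],
     [F(0),    F(1, 4), F(1, 2)]]

# Hypothesis: every column C_j of A satisfies C_j . 1 = 1.
col_sums = [sum(A[i][j] for i in range(3)) for j in range(3)]
print(col_sums)  # [1, 1, 1]

# Conclusion: (A x) . 1 = x . 1, i.e. the component sum is preserved,
# even for a vector that is not a legitimate probability vector.
x = [F(3), F(-7), F(6)]
Ax = [sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]
print(sum(Ax), sum(x))  # equal component sums
```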
 
