I want to find a solid argument for why entropy always increases and, more importantly, why it is given by

[tex]S=-\sum_i p_i \ln p_i[/tex]

I've seen some more or less sophisticated arguments. What I'd find most convincing is an argument based on general converging Markov processes, showing that the above expression increases.

For example, for a state vector [tex]\textbf{A}=(p_0,p_1,\ldots)[/tex] evolving as

[tex]\textbf{A}_n=\textbf{M}^n\textbf{A}_0,[/tex]

one would want to show that

[tex]S(\textbf{A}_{n+1})>S(\textbf{A}_n)[/tex]

under suitable conditions on [tex]\textbf{M}[/tex].

Does such kind of argument exist?
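A result of this kind does exist: if [tex]\textbf{M}[/tex] is doubly stochastic (rows and columns both sum to 1), the Shannon entropy of the state vector is non-decreasing along the chain. Here is a minimal numerical sketch of that fact, assuming NumPy; the specific 2x2 matrix and initial distribution are arbitrary choices for illustration:

```python
import numpy as np

def shannon_entropy(p):
    """S = -sum_i p_i ln p_i, with the convention 0 ln 0 = 0."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# A doubly stochastic transition matrix: rows AND columns sum to 1.
# For such chains the entropy of the distribution cannot decrease.
M = np.array([[0.7, 0.3],
              [0.3, 0.7]])

p = np.array([0.9, 0.1])          # initial state vector A_0
entropies = [shannon_entropy(p)]
for _ in range(20):
    p = M @ p                      # A_{n+1} = M A_n
    entropies.append(shannon_entropy(p))
```

Running this, the sequence of entropies increases monotonically toward the maximum [tex]\ln 2[/tex] of the uniform distribution. For a general (not doubly stochastic) Markov matrix this fails, but the relative entropy to the stationary distribution still decreases monotonically.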

**Physics Forums | Science Articles, Homework Help, Discussion**


# Arrow of time derived from Markov-like model?
