Gerenuk
I want to find a solid argument for why entropy always increases and, even more importantly, why it is
[tex]S=-\sum_i p_i \ln p_i[/tex]
I've seen some more or less sophisticated arguments. What I'd find most convincing is an argument based on general converging Markov processes, showing that the above expression increases.
For example, for a state vector [tex]\textbf{A}=(p_0,p_1,\ldots)[/tex]
[tex]\textbf{A}_n=\textbf{M}^n\textbf{A}_0[/tex]
[tex]\therefore S(\textbf{A}_{n+1})>S(\textbf{A}_n)[/tex]
under special circumstances.
Does such an argument exist?
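To make the question concrete: my guess at the "special circumstances" is that [tex]\textbf{M}[/tex] must be doubly stochastic (rows *and* columns summing to 1); then [tex]H(p)=-\sum_i p_i \ln p_i[/tex] appears to be non-decreasing along [tex]\textbf{A}_{n+1}=\textbf{M}\textbf{A}_n[/tex]. A quick numerical sketch of what I mean (the matrix here is just an arbitrary doubly stochastic example):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H = -sum p_i ln p_i, with the convention 0 ln 0 = 0."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# An example doubly stochastic matrix: every row and every column sums to 1.
M = np.array([[0.5, 0.3, 0.2],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

# Start far from uniform and iterate A_{n+1} = M A_n.
p = np.array([0.9, 0.05, 0.05])
for n in range(10):
    p_next = M @ p
    # Entropy never decreases along the chain (up to float rounding).
    assert entropy(p_next) >= entropy(p) - 1e-12
    p = p_next
```

In this example the chain converges to the uniform distribution, which is exactly the entropy maximizer, so the monotone increase is at least numerically plausible. What I'm after is the general proof behind it.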