Nick Prince
It is assumed that entropy increases in the universe. However, the fluid and acceleration equations are derived under the assumption that the expansion is adiabatic. The first law gives

TdS = dE + p dV, where dQ = TdS.

But dQ is usually set equal to zero in deriving these equations. Hence, since T is non-zero, dS must be zero, and so there would be no increase in entropy. This seems to conflict with the initial assumption. Can anyone fault my logic here?
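For concreteness, the derivation being questioned can be sketched out. This is the standard route from the first law to the fluid equation, assuming a comoving volume V ∝ a³ and energy E = ρc²V (the symbols ρ, a, and the dot for d/dt are notational assumptions not stated in the post):

```latex
% First law for a comoving region: T\,dS = dE + p\,dV, with dQ = T\,dS.
% Adiabatic assumption: dQ = 0, hence dS = 0 and dE + p\,dV = 0.
\begin{aligned}
  \dot{E} + p\,\dot{V} &= 0,
      \qquad E = \rho c^{2} V,\quad V \propto a^{3} \\
  \dot{\rho}\,c^{2} V + \left(\rho c^{2} + p\right)\dot{V} &= 0
      \qquad\text{(using } \dot{V}/V = 3\dot{a}/a\text{)} \\
  \dot{\rho} + 3\,\frac{\dot{a}}{a}\left(\rho + \frac{p}{c^{2}}\right) &= 0
\end{aligned}
```

So the fluid equation does follow from setting dQ = 0, which is exactly why the question arises: taken at face value, dQ = TdS = 0 with T ≠ 0 forces dS = 0 for the comoving fluid.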