Levi Woods
I am still confused on what exactly entropy is, any help?
Entropy can be understood as a measure of missing information about a system, given only a probability distribution over its possible states. In statistical mechanics this is the Gibbs entropy, S = -kB ∑i Pi ln(Pi), where the Pi are the probabilities of the microstates; Shannon's information entropy has the same form, with kB replaced by a conventional constant. In quantum mechanics the probabilities are encoded in the statistical operator (density matrix) ρ, which gives the von Neumann entropy S = -kB Tr(ρ ln ρ). For a system in a pure state the entropy is zero, reflecting complete knowledge of the system's state.
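As a concrete illustration of the two formulas above, here is a minimal sketch (with kB set to 1, and using numpy) that computes the Gibbs/Shannon entropy of a probability distribution and the von Neumann entropy of a density matrix via its eigenvalues; the function names and example states are my own, not from the thread:

```python
import numpy as np

def shannon_entropy(p):
    """Gibbs/Shannon entropy  S = -sum_i p_i ln(p_i), with k_B = 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention 0 * ln(0) = 0, so drop zero probabilities
    return -np.sum(p * np.log(p))

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)  # rho is Hermitian
    return shannon_entropy(evals)

# Pure state |0><0|: entropy is 0, i.e. complete knowledge of the state.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])

# Maximally mixed qubit rho = I/2: entropy is ln(2), maximal ignorance.
mixed = np.eye(2) / 2

print(von_neumann_entropy(pure))   # 0
print(von_neumann_entropy(mixed))  # ln(2) ≈ 0.693
```

The quantum case reduces to the classical formula because ρ is Hermitian: diagonalizing it turns Tr(ρ ln ρ) into a sum over its eigenvalues, which form a probability distribution.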
Prerequisites: Students and professionals in physics, particularly those focused on statistical mechanics and quantum mechanics, as well as anyone interested in the mathematical foundations of entropy and information theory.
PeterDonis said: @Levi Woods what sources have you tried to consult about entropy? What have you found difficult to understand about what they say? Your question is much too broad as it stands; focusing on particular areas of difficulty will make it much more likely that you'll get useful responses.
Levi Woods said: I've just read Wikipedia articles, which aren't very helpful. My understanding is that it's the chance that an atom will be in a certain place.