Sanky123
What is entropy? And how does it affect the system?
The discussion revolves around the concept of entropy, its definition, and its implications for systems, particularly in relation to molecular behavior and temperature. Participants explore both the physical and informational aspects of entropy, seeking to clarify its meaning and relevance.
Participants express varying definitions and interpretations of entropy, with no consensus reached on its implications or the relationship between entropy and heat loss. The discussion remains unresolved with multiple competing views presented.
Some definitions and explanations provided by participants may depend on specific contexts or assumptions that are not fully articulated. The mathematical relationships discussed are not universally agreed upon and may require further exploration.
This discussion may be of interest to those studying thermodynamics, information theory, or anyone seeking to understand the concept of entropy in both physical and informational contexts.
Sanky123 said: What is entropy? And how does it affect the system?

Have you done ANY research on your own? What do you know about it so far?
phinds said: Have you done ANY research on your own? What do you know about it so far?

Not actually researched, but ...
Ok ...

Vijay.V.Nenmeli said: Entropy is defined as the amount of disorder in a system.
That's just a conceptual definition.
For example, take a box containing, say, 100 gas molecules.
If the temperature is near absolute zero, each molecule can only move within a small region.
But if I raise the temperature to, say, 373 K, the molecules whiz about over a much larger region of space. Not only that, they also collide with each other more frequently, so there is more disorder, or entropy, as we say.
As you can see, the concept of entropy is related to the temperature of a system.
If the temperature of a body is altered, so is its molecular movement and thus its entropy.
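The link between temperature and molecular motion can be made concrete with kinetic theory. Here is a short Python sketch; the root-mean-square speed formula v_rms = sqrt(3kT/m) and the choice of a nitrogen molecule are my own additions for illustration, not from the thread:

```python
import math

K_B = 1.380649e-23        # Boltzmann constant, J/K
M_N2 = 28 * 1.66054e-27   # mass of one N2 molecule in kg (illustrative choice)

def v_rms(T, m):
    """Root-mean-square speed of a gas molecule at absolute temperature T (kelvin)."""
    return math.sqrt(3 * K_B * T / m)

cold = v_rms(10, M_N2)    # near absolute zero: on the order of 100 m/s
hot = v_rms(373, M_N2)    # boiling-water temperature: several hundred m/s
```

Since v_rms scales as the square root of T, heating the gas from 10 K to 373 K makes the molecules move roughly six times faster, which is the "whizzing about over a larger region" described above.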
Mathematically, when heating or cooling a body, we take a small interval of time over which the temperature stays essentially constant, and find the correspondingly small amount of heat transferred, dQ. The change in entropy is then defined as

dS = dQ/T  (T is the temperature in kelvin.)
For a large change in temperature, we add up all the small changes in entropy. If you're a calculus student, you'll recognize this as integrating dQ/T.
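That summing-up can be sketched numerically. Assuming we heat 1 kg of water with constant specific heat (c ≈ 4186 J/(kg·K), a textbook value not given in the thread), dQ = m·c·dT, and the sum of dQ/T approaches the closed-form integral m·c·ln(T2/T1):

```python
import math

def entropy_change(m, c, T1, T2, steps=100000):
    """Numerically sum dQ/T over many small temperature steps (dQ = m*c*dT)."""
    dT = (T2 - T1) / steps
    S = 0.0
    for i in range(steps):
        T = T1 + (i + 0.5) * dT   # midpoint temperature of this small step
        dQ = m * c * dT           # heat added during the step
        S += dQ / T
    return S

# 1 kg of water heated from 273.15 K to 373.15 K
numerical = entropy_change(1.0, 4186, 273.15, 373.15)
exact = 1.0 * 4186 * math.log(373.15 / 273.15)   # the integral of dQ/T in closed form
```

With enough steps the numerical sum matches the integral closely, which is exactly the "add up all the small changes" procedure described above.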
Doug Huffman said: To make the conceptual leap to the entropy of information, it is a measure of the number of bits to fully describe those 100 molecules.

What do you mean, sir?
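The "number of bits" idea Doug Huffman mentions is Shannon entropy. As a minimal sketch (the coin example is mine, not from the thread): the entropy of a probability distribution is the average number of bits needed to describe one outcome, and more uncertainty means more bits:

```python
import math

def shannon_entropy_bits(probs):
    """Average number of bits needed to describe one outcome drawn from `probs`."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair = shannon_entropy_bits([0.5, 0.5])    # a fair coin: exactly 1 bit per toss
biased = shannon_entropy_bits([0.9, 0.1])  # a biased coin: fewer bits, less uncertainty
```

In the same spirit, a gas near absolute zero (molecules confined to small regions) takes fewer bits to describe than a hot gas whose molecules could be almost anywhere, which connects the informational picture back to the thermodynamic one.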