Does a definition of entropy exist for complex systems?


Discussion Overview

The discussion revolves around the definition and understanding of entropy in complex systems, exploring whether a general definition exists beyond the traditional thermodynamic perspective. Participants consider various aspects of entropy, including its relationship with complexity and its application in biological systems.

Discussion Character

  • Exploratory
  • Debate/contested
  • Technical explanation

Main Points Raised

  • Some participants express uncertainty about the general definition of entropy, noting that they are familiar only with the thermodynamic definition.
  • One participant mentions that complexity and entropy are not strongly correlated, arguing that there is no widely accepted formal definition of complexity among scientists.
  • Another participant references Kolmogorov-Sinai entropy, suggesting it may relate to the discussion, but questions its relevance to biological systems.
  • In response, some participants argue that Kolmogorov entropy has applications in both complexity and biological systems, citing its use in dynamical systems.
  • There is a claim that living systems cannot decrease total entropy in an isolated system, and that biological processes always increase entropy, aligning with thermodynamic laws.
  • Participants discuss the measurement of entropy through calorimetric means and the use of computer simulations to determine probabilities in biochemical reactions.
  • One participant emphasizes that the relationship between complexity and entropy requires approximations that may not hold true in all cases.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the relationship between entropy and complexity, with multiple competing views presented regarding the relevance of Kolmogorov-Sinai entropy to biological systems and the definitions of complexity.

Contextual Notes

There are limitations in the discussion regarding the definitions of complexity and entropy, as well as the assumptions made about their relationships. The conversation reflects a range of perspectives without resolving the underlying uncertainties.

GME:
Common extensive quantities such as mass, charge, and volume can be defined for general systems. I imagine that we can define and measure them without any problem for any kind of complex system as well. However, I do not know a general definition of entropy; I only know the thermodynamic definition. Is there a generalization? Lots of books deal with the entropy of the Universe and of all kinds of complex biological systems.
 
This is out of my current field, but I read some papers on entropy in complex systems back in grad school. I recall that Stuart Kauffman was a key researcher in that field. I see in PubMed that he's still working on complex-system thermodynamics -- you might look at his work. PubMed search "Kauffman SA".
 
GME said:
Common extensive quantities such as mass, charge, and volume can be defined for general systems. I imagine that we can define and measure them without any problem for any kind of complex system as well. However, I do not know a general definition of entropy; I only know the thermodynamic definition. Is there a generalization? Lots of books deal with the entropy of the Universe and of all kinds of complex biological systems.
Complexity has little to do with entropy. Entropy has to do with the probability of being in a particular state. Entropy and complexity are not strongly correlated.
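A minimal statement of that probabilistic definition, assuming the usual Gibbs/Boltzmann convention of statistical mechanics (this is the standard textbook form, not anything specific to complex systems), is
$$S = -k_B \sum_i p_i \ln p_i ,$$
where the $p_i$ are the probabilities of the individual microstates; when all $W$ accessible microstates are equally probable this reduces to Boltzmann's $S = k_B \ln W$.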
There is no formal definition of complexity that is widely accepted among scientists, and there is no formula that determines complexity from probability. Therefore, there is no way to correlate complexity with probability, either.
There are formal definitions of probability that are accepted among mathematicians. Furthermore, there are relationships between probability and thermodynamics widely accepted among physicists. Therefore, there is some formal connection between probability and thermodynamics. Probability and thermodynamics are commonly used by chemists. However, there is no proven method to link complexity with entropy.
There is an intuition that "complex systems" are "less probable". However, there is no quantitative way to use this relationship. There is no scientific way to prove that such a relationship exists, even though it is a rule of thumb.
Physicists and chemists frequently use thermodynamics and probability theory to analyze biological systems. So far as can be determined scientifically, biological systems satisfy the same physical laws as nonbiological systems. No scientist has ever been able to show that biological systems violate any law of thermodynamics, and no one has been able to show that intelligent intervention does so. So far as can be determined by quantitative calculation, biological functions, including intelligent action, are constrained by the laws of thermodynamics.
So far as has been determined, living things can't decrease the total entropy in an isolated system. The chemical processes of living things always increase the entropy of the system. Reproduction, growth and evolution all appear consistent with all the laws of thermodynamics.
Entropy can be, and frequently is, measured by calorimetric means. Furthermore, the probabilities of many biochemical reactions can be determined using computer simulations. For every biochemical reaction that has been analyzed, the total entropy of the system increases. Entropy can be moved around and concentrated in small regions, but the total entropy cannot be destroyed.
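As a sketch of that calorimetric route, under the standard assumption of heating along a reversible path at constant pressure,
$$\Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T} = \int_{T_1}^{T_2} \frac{C_p(T)}{T}\,dT ,$$
so measuring the heat capacity $C_p(T)$ over a temperature range gives the entropy change directly.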
This seems counter-intuitive when one pictures only the macroscopic scales of a system. However, entropy includes contributions from both the macroscopic and the microscopic scales, and the entropy associated with the microscopic scales is much larger than the entropy associated with the macroscopic scales. Complexity is related to the macroscopic scales.
In order to relate complexity to entropy, one has to make the approximation that the microscopic and atomic structures have no contribution to entropy. This approximation is generally not true.
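A rough back-of-the-envelope comparison of the two scales (the numbers below are purely illustrative: 100 distinguishable macroscopic parts, and a textbook value of roughly 70 J/(mol·K) for the standard molar entropy of liquid water):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# "Macroscopic" configurational entropy: scrambling the arrangement
# of 100 distinguishable macroscopic parts gives S = k_B * ln(100!).
n_parts = 100
S_macro = k_B * math.lgamma(n_parts + 1)

# "Microscopic" entropy: standard molar entropy of liquid water at
# room temperature, roughly 70 J/(mol K) (textbook value).
S_micro = 70.0  # J/K per mole

print(f"Arrangement entropy of 100 parts: {S_macro:.2e} J/K")
print(f"Molar entropy of water:           {S_micro:.2e} J/K")
print(f"Ratio (micro / macro):            {S_micro / S_macro:.1e}")
```

Under these assumptions the molecular-scale entropy exceeds the arrangement entropy by more than twenty orders of magnitude, which is why neglecting the microscopic contribution is such a drastic approximation.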
 
Pythagorean said:
Please note that the word "complexity" is not found anywhere in this link. Furthermore, there is no evidence provided that the systems analyzed by Kolmogorov have anything to do with biological systems. Therefore, the concepts in this link don't really relate to what the OP is asking.
 
Actually, there's plenty of evidence that the systems analyzed with Kolmogorov entropy have to do with both complexity and biological systems, given that Kolmogorov entropy is applied to dynamical systems and there are numerous applications of dynamical systems to both complexity and biology. It's a generic measure of a dynamical system.
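For reference, the standard textbook definition of the Kolmogorov-Sinai entropy of a measure-preserving transformation $T$ is
$$h_{KS}(T) = \sup_{\xi} \lim_{n \to \infty} \frac{1}{n}\, H\!\left(\bigvee_{i=0}^{n-1} T^{-i}\xi\right),$$
where the supremum is over finite measurable partitions $\xi$ and $H(\cdot)$ is the Shannon entropy of a partition. It measures the rate at which the dynamics generates information, which is why it turns up in the applications linked below.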

Kolmogorov himself was quite interested in mathematical biology and information theory, both of which apply to biological systems:
http://www.scholarpedia.org/article/Andrey_Nikolaevich_Kolmogorov
http://www.scholarpedia.org/article/Complexity
 
Specific papers using Kolmogorov entropy in biology/medicine:

http://prl.aps.org/abstract/PRL/v89/i6/e068102
http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1588208
http://www.sciencedirect.com/science/article/pii/S0167278998001778
http://www.sciencedirect.com/science/article/pii/0167278995003010

A lot of the analysis developed for dynamical systems applies to physiological signals. Even a computation of the fractal dimension of a heart signal can predict failure:

http://www.ncbi.nlm.nih.gov/pubmed/17061937
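As a concrete (hypothetical) illustration of this kind of signal analysis, here is a minimal sample-entropy estimator of the sort commonly applied to heart-rate series. The signal is synthetic and the parameter choices (m = 2, r = 0.2·std) are just the usual textbook defaults, not taken from any of the papers above:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Estimate the sample entropy (SampEn) of a 1-D signal.

    Counts pairs of length-m templates that match within tolerance r
    (Chebyshev distance), does the same for length m + 1, and returns
    -ln(ratio). Self-matches are excluded.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)

    def count_matches(length):
        # Use n - m templates for both lengths so the counts are comparable.
        templates = np.array([x[i:i + length] for i in range(n - m)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance from template i to all later templates.
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = count_matches(m)      # matching pairs of length m
    a = count_matches(m + 1)  # matching pairs of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Synthetic "heart-rate-like" series: a slow oscillation plus noise.
rng = np.random.default_rng(0)
t = np.arange(1000)
signal = np.sin(0.05 * t) + 0.3 * rng.standard_normal(t.size)
print(f"Sample entropy: {sample_entropy(signal):.3f}")
```

Lower values indicate a more regular, more predictable signal.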
 

Similar threads

  • · Replies 6 ·
Replies
6
Views
2K
Replies
2
Views
2K
  • · Replies 12 ·
Replies
12
Views
3K
  • · Replies 5 ·
Replies
5
Views
840
  • · Replies 17 ·
Replies
17
Views
3K
  • · Replies 15 ·
Replies
15
Views
4K
  • · Replies 1 ·
Replies
1
Views
965
  • · Replies 3 ·
Replies
3
Views
1K
  • · Replies 3 ·
Replies
3
Views
3K
  • · Replies 7 ·
Replies
7
Views
3K