Does a definition of entropy for complex systems exist?

  • #1
GME
Common extensive quantities such as mass, charge, and volume can be defined for general systems, and I imagine we could measure and define them without any problem for any kind of complex system as well. However, I only know the thermodynamic definition of entropy, not a general one. Is there a generalization? Many books deal with the entropy of the Universe and of all kinds of complex biological systems.
 
  • #2
This is out of my current field, but I read some papers on entropy in complex systems back in grad school. I recall that Stuart Kauffman was a key researcher in that area. I see in PubMed that he's still working in complex-system thermodynamics -- you might look at his stuff. PubMed search "Kauffman SA".
 
  • #4
GME said:
Common extensive quantities such as mass, charge, and volume can be defined for general systems, and I imagine we could measure and define them without any problem for any kind of complex system as well. However, I only know the thermodynamic definition of entropy, not a general one. Is there a generalization? Many books deal with the entropy of the Universe and of all kinds of complex biological systems.
Complexity has little to do with entropy. Entropy is tied to the probability of a system being in a particular state, and entropy and complexity are not strongly correlated.
There is no formal definition of complexity that is widely accepted among scientists, and no formula that determines complexity from probability. Therefore, there is no way to correlate complexity with probability, either.
There are formal definitions of probability that are accepted among mathematicians. Furthermore, there are relationships between probability and thermodynamics widely accepted among physicists. Therefore, there is some formal connection between probability and thermodynamics. Probability and thermodynamics are commonly used by chemists. However, there is no proven method to link complexity with entropy.
There is an intuition that "complex systems" are "less probable". However, there is no quantitative way to use this relationship. There is no scientific way to prove that such a relationship exists, even though it is a rule of thumb.
Physicists and chemists frequently use thermodynamics and probability theory to analyze biological systems. So far as can be determined scientifically, biological systems satisfy the same physical laws as nonbiological systems. No scientist has ever shown that biological systems violate any law of thermodynamics, nor that intelligent intervention does. So far as can be determined by quantitative calculation, biological functions, including intelligent action, are constrained by the laws of thermodynamics.
So far as has been determined, living things can't decrease the total entropy in an isolated system. The chemical processes of living things always increase the entropy of the system. Reproduction, growth and evolution all appear consistent with all the laws of thermodynamics.
Entropy can be, and frequently is, measured by calorimetric means. Furthermore, the probabilities of many biochemical reactions can be determined using computer simulations. For every biochemical reaction that has been analyzed, the total entropy of the system increases. Entropy can be moved around and concentrated in small regions, but the total entropy cannot be destroyed.
This seems counterintuitive when one pictures only the macroscopic scale of a system. However, entropy includes both macroscopic and microscopic scales, and the entropy associated with microscopic scales is much larger than that associated with macroscopic scales. Complexity is related to the macroscopic scales.
In order to relate complexity to entropy, one has to make the approximation that the microscopic and atomic structures have no contribution to entropy. This approximation is generally not true.
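The point above, that entropy is a function of state probabilities rather than of "complexity", can be made concrete with the Gibbs/Shannon formula S = -k Σ pᵢ ln pᵢ. A minimal sketch (the function name and the choice k = 1 are illustrative, not from the thread):

```python
import math

def gibbs_entropy(probs, k=1.0):
    """Gibbs/Shannon entropy S = -k * sum(p * ln p) over states with p > 0."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# A system certain to be in one state has zero entropy,
# while a uniform distribution over W states gives the maximum, k ln W.
certain = gibbs_entropy([1.0, 0.0, 0.0, 0.0])   # zero
uniform = gibbs_entropy([0.25] * 4)             # ln 4 ~ 1.386
print(certain, uniform)
```

Note that nothing in the formula measures how "complex" a state looks, only how probable it is, which is the distinction the post is drawing.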
 
  • #5
Pythagorean said:
Please note that the word "complexity" is not found anywhere in this link. Furthermore, there is no evidence provided that the systems analyzed by Kolmogorov have anything to do with biological systems. Therefore, the concepts in this link don't really relate to what the OP is asking.
 
  • #6
Actually, there is plenty of evidence that systems analyzed with Kolmogorov entropy have a great deal to do with both complexity and biological systems: Kolmogorov entropy is defined for dynamical systems, and there are numerous applications of dynamical systems to both complexity and biology. It is a generic measure for dynamical systems.

Kolmogorov himself was quite interested in mathematical biology and information theory, both of which apply to biological systems:
http://www.scholarpedia.org/article/Andrey_Nikolaevich_Kolmogorov
http://www.scholarpedia.org/article/Complexity
 
  • #7
Specific papers using Kolmogorov entropy in biology/medicine:

http://prl.aps.org/abstract/PRL/v89/i6/e068102
http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1588208
http://www.sciencedirect.com/science/article/pii/S0167278998001778
http://www.sciencedirect.com/science/article/pii/0167278995003010

A lot of dynamical-systems analysis applies to physiological signals. Even a computation of the fractal dimension of a heart-rate signal can help predict heart failure:

http://www.ncbi.nlm.nih.gov/pubmed/17061937
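In practice, the Kolmogorov-Sinai entropy of a dynamical system is rarely computable directly from data; papers on physiological signals like those above typically use finite-data estimators such as sample entropy. A minimal sketch of sample entropy (the function name and parameter defaults are illustrative, not taken from the linked papers):

```python
import math

def sample_entropy(signal, m=2, r=0.2):
    """Sample entropy, a common finite-data surrogate for entropy rate:
    -ln(A/B), where B counts pairs of length-m template vectors within
    tolerance r (Chebyshev distance) and A counts the same for length m+1.
    Lower values indicate a more regular, predictable signal."""
    n = len(signal)

    def matches(length):
        templates = [signal[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    return -math.log(matches(m + 1) / matches(m))

# A strictly periodic "signal" is highly regular, so its sample entropy is near zero.
print(sample_entropy([0.0, 1.0] * 50))
```

An irregular signal (e.g. noise) yields a noticeably larger value, which is what makes such measures useful for distinguishing healthy from pathological physiological dynamics.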
 

1. What is the definition of entropy of complex systems?

The entropy of a complex system is a measure of the disorder or randomness within the system. It is a thermodynamic quantity related to the number of possible configurations or arrangements the system can have.

2. How is entropy calculated for complex systems?

The calculation of entropy for complex systems involves using statistical mechanics principles to determine the number of microstates (possible arrangements) that a system can have. This is then used in the equation S = k ln W, where S is the entropy, k is the Boltzmann constant, and W is the number of microstates.
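As a quick numerical illustration of S = k ln W (the setup of N independent two-state particles is a hypothetical example, not from the thread):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(W):
    """Entropy of a system with W equally likely microstates: S = k ln W."""
    return k_B * math.log(W)

# Example: N independent two-state particles have W = 2**N microstates,
# so S = k_B * N * ln 2 grows linearly with system size (entropy is extensive).
N = 100
print(boltzmann_entropy(2 ** N))  # ~ 9.57e-22 J/K
```

Because the microstate count enters through a logarithm, entropies of independent subsystems add, which is why entropy behaves as an extensive quantity like the mass and volume mentioned in the opening post.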

3. What is the relationship between entropy and complexity?

Entropy and complexity are often loosely associated: as the number of accessible configurations or states of a system increases, its entropy increases. However, as discussed in the thread above, there is no widely accepted quantitative relationship between complexity and entropy, and the two are not strongly correlated in general.

4. How does the concept of entropy apply to real-world complex systems?

In real-world complex systems, such as biological systems or ecosystems, entropy can be used to understand and predict the behavior and dynamics of these systems. It can also be used to analyze the efficiency and stability of these systems.

5. Can entropy be decreased or reversed in a complex system?

According to the second law of thermodynamics, the total entropy of an isolated system never decreases over time. However, the entropy of a local subsystem can decrease, provided at least as much entropy is exported to the surroundings, for example by a flow of energy through the system. This local decrease is sometimes called negative entropy or negentropy.
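A toy calculation shows how a local entropy decrease is paid for globally: when heat Q flows from a hot reservoir to a cold one, the hot reservoir's entropy drops by Q/T_hot, but the cold reservoir gains the larger amount Q/T_cold. (The numbers below are illustrative, not from the text.)

```python
def entropy_change(Q, T):
    """Entropy change of a reservoir at constant temperature T (K)
    receiving heat Q (J); negative Q means heat leaves."""
    return Q / T

Q = 100.0                    # J of heat flowing spontaneously hot -> cold
T_hot, T_cold = 300.0, 250.0  # K

dS_hot = entropy_change(-Q, T_hot)    # -0.333... J/K: a local entropy decrease
dS_cold = entropy_change(Q, T_cold)   # +0.400 J/K
print(dS_hot + dS_cold)               # total ~ +0.0667 J/K: net increase
```

The hot reservoir's entropy genuinely decreases, yet the total still rises, which is the same bookkeeping that lets living things maintain low-entropy order locally while increasing total entropy.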
