How do living beings evolve in a universe with increasing entropy?

  • Thread starter: Deepak K Kapur
  • Tags: Entropy
AI Thread Summary
The discussion centers on the apparent contradiction between the evolution of ordered living beings and the second law of thermodynamics, which states that entropy in a closed system must increase. Participants clarify that Earth is not a closed system, as it receives energy from the sun, allowing for localized decreases in entropy. They emphasize that equilibrium, often misconceived as an ordered state, is actually a state of maximum entropy and disorder. The conversation also touches on the distinction between thermodynamic entropy and the concept of entropy as a measure of our knowledge about a system. Ultimately, understanding the rigorous definitions of entropy is crucial for resolving these conceptual challenges.
Deepak K Kapur
Our universe is considered a closed system, and the second law says that the entropy of a closed system is bound to increase.

Then how could living beings evolve, when they are such extremely ordered systems?
 
The Earth is not a closed system. It gets energy from the sun.
 
Deepak K Kapur said:
Our universe is considered a closed system, and the second law says that the entropy of a closed system is bound to increase.

Then how could living beings evolve, when they are such extremely ordered systems?

Within the closed system, there can be pockets where entropy decreases, as long as that is balanced out by pockets where entropy increases. It is just that the TOTAL overall entropy increases.

Zz.
 
ZapperZ said:
Within the closed system, there can be pockets where entropy decreases, as long as that is balanced out by pockets where entropy increases. It is just that the TOTAL overall entropy increases.

Zz.

Any experimental proof for this statement?
 
Deepak K Kapur said:
Any experimental proof for this statement?

Us. We exist.
 
Deepak K Kapur said:
Any experimental proof for this statement?

Drop an ice cube into a cup of warm water. Compute!

Zz.
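As a rough illustration of ZapperZ's ice-cube exercise, here is a minimal Python sketch; the masses and temperatures are assumed example values. The ice and cold meltwater gain more entropy than the warm water loses, so the total entropy of the cup still goes up even though part of the system cools.

```python
import math

c_water = 4.186   # specific heat of liquid water, J/(g*K)
L_fusion = 334.0  # latent heat of fusion of ice, J/g

m_ice, T_ice = 20.0, 273.15   # 20 g of ice at 0 degC (assumed)
m_w, T_w = 200.0, 313.15      # 200 g of water at 40 degC (assumed)

# Final temperature from the energy balance (melt the ice, then mix):
T_f = (m_w * c_water * T_w + m_ice * c_water * T_ice
       - m_ice * L_fusion) / ((m_w + m_ice) * c_water)

dS_melt = m_ice * L_fusion / T_ice                 # ice melting at 273.15 K
dS_cold = m_ice * c_water * math.log(T_f / T_ice)  # meltwater warming up
dS_warm = m_w * c_water * math.log(T_f / T_w)      # warm water cooling (negative)

print(f"T_final  = {T_f:.2f} K")
print(f"dS_total = {dS_melt + dS_cold + dS_warm:.2f} J/K  (positive)")
```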
 
ZapperZ said:
Drop an ice cube into a cup of warm water. Compute!

Zz.

OK. Fine.

But, I have a problem regarding this.

When the ice cube melts completely, an equilibrium will be reached, i.e. a state of highest entropy will be reached.

Isn't equilibrium itself a kind of 'ordered state' (a state where there is perfect balance)?

Why call such a 'balanced' state a disordered one?
 
Deepak K Kapur said:
OK. Fine.

But, I have a problem regarding this.

When the ice cube melts completely, an equilibrium will be reached, i.e. a state of highest entropy will be reached.

Isn't equilibrium itself a kind of 'ordered state' (a state where there is perfect balance)?

Why call such a 'balanced' state a disordered one?

You have a strange way of defining "equilibrium". In fact, I find much of the stuff you've "acquired" along the way in many of your posts to be rather strange.

"Equilibrium" simply means, in this case, d(something)/dt = 0.

You really need to look up and read the PHYSICS (not the pedestrian) definition of entropy. How about starting with the material they have on the entropy site:

http://entropysite.oxy.edu/

Zz.
 
  • #10
Deepak K Kapur said:
Any experimental proof for this statement?
Heat pumps work.
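To make the heat-pump point quantitative, here is a small sketch with illustrative, assumed numbers: the cold reservoir loses entropy (a local decrease), but the second law forces the hot side to gain at least as much, which is why the work input can never be zero.

```python
T_cold, T_hot = 273.0, 293.0   # outdoor and indoor temperatures, K (assumed)
Q_cold = 1000.0                # heat pulled out of the cold side, J (assumed)
W = 100.0                      # electrical work driving the pump, J (assumed)
Q_hot = Q_cold + W             # heat delivered to the warm side

dS_cold = -Q_cold / T_cold     # local entropy DECREASE outdoors
dS_hot = Q_hot / T_hot         # entropy increase indoors

print(dS_cold, dS_hot, dS_cold + dS_hot)   # the total comes out positive
print(Q_cold * (T_hot - T_cold) / T_cold)  # minimum work the 2nd law allows (~73 J here)
```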
 
  • #11
ZapperZ said:
In fact, I find much of the stuff you've "acquired" along the way

Zz.

I think 99.999999% of people on this planet only acquire stuff along the way...

It's only a miniscule minority that ever says something new...
 
  • #12
Deepak K Kapur said:
Isn't equilibrium itself a kind of 'ordered state' (a state where there is perfect balance).
I think this may be a big source of your conceptual unease. You have this exactly backwards. Thermal equilibrium is not an ordered state.

Rigorously, you should stick with the standard and unambiguous notion of entropy, and not your colloquial concept of order. However, if you do insist on thinking in colloquial terms then you at least need to think carefully about your concept.

Colloquially, order is when the books are in the bookcase, the clean laundry is in the drawers, and the dirty laundry is in the hamper, while disorder is when they are all on the floor. Order has boundaries and separations; disorder is uniform. Equilibrium is disorder.

Saying that equilibrium is ordered is wrong both rigorously and colloquially.
 
  • #13
Dale said:
I think this may be a big source of your conceptual unease. You have this exactly backwards. Thermal equilibrium is not an ordered state.

Rigorously, you should stick with the standard and unambiguous notion of entropy, and not your colloquial concept of order. However, if you do insist on thinking in colloquial terms then you at least need to think carefully about your concept.

Colloquially, order is when the books are in the bookcase, the clean laundry is in the drawers, and the dirty laundry is in the hamper, while disorder is when they are all on the floor. Order has boundaries and separations; disorder is uniform. Equilibrium is disorder.

Saying that equilibrium is ordered is wrong both rigorously and colloquially.

If icing is to be spread on a cake and we do it in lumps (boundaries), it is disordered.

But if we spread the icing evenly (uniformly), it is 'ordered'...
 
  • #14
Deepak K Kapur said:
If icing is to be spread on a cake and we do it in lumps (boundaries), it is disordered
You are confusing "ugly" with "disordered". Just because you have an aesthetic preference for smooth rather than lumps doesn't mean it is more ordered. If you start with big lumps with sharp boundaries and randomly perturb it (e.g. heat or vibration) then you can get smooth icing and the boundaries will reduce. If you start with smooth icing and randomly perturb it then you will not suddenly get big lumps with sharp boundaries. Despite your dislike for such lumpy icing, it is in fact more ordered.

It is clear that your intuitive concept of "disordered" is simply wrong. This is one of the reasons why we develop rigorous quantitative definitions. Please stick with the technical concept of entropy. Your intuition for this concept will improve over time, but right now you need to use the rigorous definition as you work to correct some faulty assumptions.

Don't worry and don't give up. This sort of thing happens all the time, and it can be overcome by consistently relying on the rigorous definition until the intuition builds later.
 
  • #15
Dale said:
Don't worry and don't give up.

So, I ask further...

To my mind, if we come to know everything about all the particles at equilibrium (just suppose), we wouldn't call it disorder AT ALL. Then the entropy at equilibrium would be zero.

So, it seems our 'inability' is translated as disorder...

Hope, there is some sense in my pedestrian views.
 
  • #16
Deepak K Kapur said:
To my mind, if we come to know everything about all the particles at equilibrium (just suppose), we wouldn't call it disorder AT ALL. Then the entropy at equilibrium would be zero.
What does the definition of entropy say? Apply the actual rigorous definition to the situation.
 
  • #17
Deepak, why do you come here? When you're told you're wrong, you just dig in harder. I'm afraid that doesn't make you right. It also isn't the behavior of a student; it's the behavior of a crackpot. Finally and most importantly, the process of learning is exchanging wrong ideas for right ones. If you don't give up the wrong ideas, you're not learning, and you're just wasting your time - and everybody else's.

Redefining entropy makes discussion impossible. As I wrote a week ago:

Humpty Dumpty smiled contemptuously. 'Of course you don't — till I tell you. I meant "there's a nice knock-down argument for you!"'

'But "glory" doesn't mean "a nice knock-down argument",' Alice objected.

'When I use a word,' Humpty Dumpty said, in rather a scornful tone, 'it means just what I choose it to mean — neither more nor less.'

'The question is,' said Alice, 'whether you can make words mean so many different things.'

'The question is,' said Humpty Dumpty, 'which is to be master — that's all.'

Alice was too much puzzled to say anything; so after a minute Humpty Dumpty began again. 'They've a temper, some of them — particularly verbs: they're the proudest — adjectives you can do anything with, but not verbs — however, I can manage the whole lot of them! Impenetrability! That's what I say!'

And here we are, Humpty-Dumpting away again. If you don't use standard definitions, we cannot communicate.
 
  • #18
Deepak K Kapur said:
So, I ask further...

To my mind, if we come to know everything about all the particles at equilibrium (just suppose), we wouldn't call it disorder AT ALL. Then the entropy at equilibrium would be zero.

So, it seems our 'inability' is translated as disorder...

Hope, there is some sense in my pedestrian views.

On one hand, there is the idea of entropy as expressing our lack of knowledge of the system. On the other hand, we have the thermodynamic idea of entropy, which features in the laws of thermodynamics. They're related but maybe a little bit different.

If we had a system of particles in equilibrium, the thermodynamic entropy is a definite calculated nonzero value. But thinking of entropy as being our lack of knowledge, if we could see all the particles' positions, the system has zero entropy.

As another example, if you have spheres in a box, at low density they are very randomly arranged, but if you decrease the size of the box, they order into a lattice pattern. So, the knowledge entropy decreases. But, the system has no energy involved, so it's not so clear how to interpret this in terms of the thermodynamic definition of entropy.
 
  • #19
BruceW said:
On one hand, there is the idea of entropy as expressing our lack of knowledge of the system. On the other hand, we have the thermodynamic idea of entropy, which features in the laws of thermodynamics. They're related but maybe a little bit different.

No, the concept of STATISTICS is the expression of our lack of knowledge of every individual particle in the system, not just entropy. So the whole of thermodynamics and statistical mechanics is included in that. This is not the definition of entropy.

If we had a system of particles in equilibrium, the thermodynamic entropy is a definite calculated nonzero value. But thinking of entropy as being our lack of knowledge, if we could see all the particles' positions, the system has zero entropy.

Where did you get that?

Again, as with the wrong idea exhibited by the OP, entropy is NOT disorder, or even a lack of knowledge.

http://news.fnal.gov/2013/06/entropy-is-not-disorder/
http://entropysite.oxy.edu/entropy_isnot_disorder.html
http://www2.ucdsb.on.ca/tiss/stretton/CHEM2/entropy_new_1.htm
http://home.iitk.ac.in/~osegu/Land_PhysLettA.pdf

Zz.
 
  • #20
This wiki article https://en.wikipedia.org/wiki/Entropy_(information_theory) describes the kind of entropy I'm thinking about when I say the amount of disorder, or lack of knowledge. They actually use the word 'surprisal', but that is maybe a bit intimidating for beginners, which is why people say disorder instead (I would guess).

So yes, this information definition of entropy would be one way to express our lack of information and there are many other statistics that you could choose.

Thermodynamics and statistical mechanics are different disciplines. They should agree on all concepts where they overlap, but you get so much weird stuff happening in statistical mechanics that I feel it's best to specify if someone is talking about statistical mechanics or thermodynamics at the outset, for clarity.
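For the information-theoretic entropy BruceW is referring to, a quick sketch of the Shannon formula ##H=-\sum_i p_i\log_2 p_i## makes the "disorder" reading concrete: a perfectly known outcome has zero entropy, and the uniform ("equilibrium-like") distribution has the maximum. The probability values below are just made-up examples.

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0  -> outcome fully known
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.36 bits
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0  -> maximal for 4 outcomes
```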
 
  • #21
BruceW said:
This wiki article https://en.wikipedia.org/wiki/Entropy_(information_theory) describes the kind of entropy I'm thinking about when I say the amount of disorder, or lack of knowledge. They actually use the word 'surprisal', but that is maybe a bit intimidating for beginners, which is why people say disorder instead (I would guess).

So yes, this information definition of entropy would be one way to express our lack of information and there are many other statistics that you could choose.
You are taking a very loose interpretation of the article, which does not clarify anything for the OP.
BruceW said:
Thermodynamics and statistical mechanics are different disciplines. They should agree on all concepts where they overlap, but you get so much weird stuff happening in statistical mechanics that I feel it's best to specify if someone is talking about statistical mechanics or thermodynamics at the outset, for clarity.
Thermodynamics and statistical mechanics are the respective macroscopic and microscopic theories for the same physical processes. They are not different theories but are instead quite complementary. Entropy in particular, on the macroscopic scale, is defined such that its differential change is equal to the differential heat added to the system divided by the temperature
$$\frac{\delta Q}{T}=\delta S$$
Notice that there is nothing in this equation that is directly implying disorder or "lack of information".
On the microscopic scale, entropy is defined as proportional to the natural logarithm of the number of microstates ##W## at equilibrium
$$S=k\text{ln}(W)$$
Again, this is not implying "disorder". It says that entropy is related to the number of ways one can organize the constituent particles of a system while still maintaining the macroscopic configuration.
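As a sanity check that the two definitions above give the same answer, consider the textbook free expansion of an ideal gas into double the volume. Counting microstates, each particle has twice as many position "cells" available, so ##W## grows by a factor ##2^N## and ##\Delta S = Nk\ln 2##; the macroscopic isothermal result is ##\Delta S = nR\ln 2##. A minimal sketch:

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
N_A = 6.02214076e23   # Avogadro's number

def dS_microscopic(N):
    # W_final / W_initial = 2**N  ->  dS = k * ln(2**N) = N * k * ln(2)
    return N * k_B * math.log(2)

def dS_thermodynamic(n_moles):
    # Ideal gas, isothermal doubling of volume: dS = n * R * ln(V2/V1) = n * R * ln(2)
    R = k_B * N_A
    return n_moles * R * math.log(2)

print(dS_microscopic(N_A))    # ~5.76 J/K for one mole, counting microstates
print(dS_thermodynamic(1.0))  # same number from the macroscopic definition
```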
 
  • #22
hm, as an example, the free energy of the 1D Ising model is $$f(\beta ,h)=-\lim _{L\to \infty }{\frac {1}{\beta L}}\ln(Z(\beta ))=-{\frac {1}{\beta }}\ln \left(e^{\beta J}\cosh \beta h+{\sqrt {e^{2\beta J}(\sinh \beta h)^{2}+e^{-2\beta J}}}\right)$$ I would prefer to say this macroscopic quantity is a result from statistical mechanics rather than thermodynamics, but I guess it just depends on the preference. Probably you could list it in either section of a journal.

Also, maybe these articles are a bit better than the other one I linked to: https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory and https://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics). I think our main difference is that I would also like to interpret entropy as being related to information, but you would prefer not to.
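If it helps to see that macroscopic formula in action, here is a small sketch (with ##k_B=1## assumed, ##J## and ##h## as illustrative parameters) that evaluates the 1D Ising free energy per site quoted above and extracts the entropy per site as ##s=-\partial f/\partial T## by a numerical derivative. It goes to 0 at low temperature (ordered) and to ##\ln 2## at high temperature (disordered), as expected.

```python
import math

def free_energy(T, J=1.0, h=0.0):
    """Free energy per site of the 1D Ising model (k_B = 1)."""
    b = 1.0 / T
    inner = (math.exp(b * J) * math.cosh(b * h)
             + math.sqrt(math.exp(2 * b * J) * math.sinh(b * h) ** 2 + math.exp(-2 * b * J)))
    return -math.log(inner) / b

def entropy_per_site(T, J=1.0, h=0.0, dT=1e-4):
    """s = -df/dT, estimated with a central difference."""
    return -(free_energy(T + dT, J, h) - free_energy(T - dT, J, h)) / (2 * dT)

for T in (0.2, 1.0, 5.0, 50.0):
    print(f"T = {T:5.1f}   s = {entropy_per_site(T):.4f}")
# s -> 0 as T -> 0 and s -> ln(2) ~ 0.693 as T -> infinity
```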
 
  • #23
The Wikipedia disambiguation page on entropy shows 16 scientific definitions (presumably all correct). IMO that makes entropy particularly hard to discuss. People talk past each other with differing definitions in their heads.
 
  • #24
BruceW said:
I think our main difference is that I would also like to interpret entropy as being related to information, but you would prefer not to.
anorlunda said:
The Wikipedia disambiguation page on entropy shows 16 scientific definitions (presumably all correct). IMO that makes entropy particularly hard to discuss. People talk past each other with differing definitions in their heads.
I think our goal here should be to help the OP to understand the canonical definition of entropy rather than discussing higher level concepts such as information theory. In order to do this, we should stick to the definition of entropy as stated in introductory statistical mechanics books.

@Vanadium 50 has already stated that we should be using standard definitions here.
 
  • #25
Thanks everyone for their answers...

A thing comes to my mind here...

'Everything is debatable'...
Maybe this line is also debatable☺️☺️
 
  • #26
Deepak K Kapur said:
To my mind, if we come to know everything about all the particles at equilibrium (just suppose), we wouldn't call it disorder AT ALL. Then the entropy at equilibrium would be zero.

If you knew the positions and velocities of each of the particles you'd have a definition of one single microstate. But there are lots of other microstates that would also be states of equilibrium. Now, look at a nonequilibrium state and you find that there are, again, lots of different microstates that can form that nonequilibrium state. It turns out, though, that there are more microstates for the equilibrium state than for the nonequilibrium state, so the equilibrium state is more likely, usually far more likely. That is the reason things tend towards equilibrium.

The same scientific methods that produced the 2nd Law also produced the theory of evolution. So when you argue that one of those ideas is false because the other one is true, you accept the validity of one of them based on science but you reject the validity of the other, and you also base that on science! Thus you are saying that science is wrong because science is right. :wideeyed:
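Mister T's counting argument is easy to make concrete. For a toy box of 100 particles (an assumed, illustrative number), each equally likely to be in the left or right half, the "equilibrium" 50/50 macrostate already has thousands of times more microstates than a 70/30 split; for a realistic ##N\sim 10^{23}## the ratio is astronomically larger. A minimal sketch:

```python
from math import comb, log

N = 100  # particles, each in the left or right half of the box

W_even = comb(N, N // 2)   # microstates with a 50/50 split
W_lopsided = comb(N, 70)   # microstates with a 70/30 split

print(W_even / W_lopsided)            # ~3.4e3: the even split is far more likely
print(log(W_even) - log(W_lopsided))  # difference in S/k = ln(W), ~8.1
```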
 
  • #27
Deepak K Kapur said:
Thanks everyone for their answers...

A thing comes to my mind here...

'Everything is debatable'...
Maybe this line is also debatable☺️☺️
So are you going to apply the rigorous definition of entropy to your last question? You cannot expect to learn without some personal effort.
 
  • #28
Mister T said:
The same scientific methods that produced the 2nd Law also produced the theory of evolution. So when you argue that one of those ideas is false because the other one is true, you accept the validity of one of them based on science but you reject the validity of the other, and you also base that on science! Thus you are saying that science is wrong because science is right. :wideeyed:

Exactly. Once upon a time, I used to engage in debates with creationists who used "entropy says things should get more disordered" as an argument against evolution. Over time, I came to the point of view that this is not only wrong, but in a way, the opposite of the truth. The tendency toward increasing entropy is what drives evolution and other living processes, in a similar way that the tendency for water to run downhill can drive waterwheels to generate power.
 
  • #29
Dale said:
So are you going to apply the rigorous definition of entropy to your last question? You cannot expect to learn without some personal effort.

You may feel annoyed, but...

Sometimes, rigorous tends to mean 'accepted'.
 
  • #30
Deepak K Kapur said:
You may feel annoyed, but...

Sometimes, rigorous tends to mean 'accepted'.

I don't think that rigorous ever means that. Rigorous means careful, painstaking. Avoiding hand-waving and appeals to intuition. A rigorous derivation or proof is one where the assumptions and rules are clear, and the steps are laid out in detail so that there is pretty much no room for doubt about the conclusion (as long as you also accept the assumptions and rules).
 
  • #31
Deepak K Kapur said:
OK. Fine.

But, I have a problem regarding this.

When the ice cube melts completely, an equilibrium will be reached, i.e. a state of highest entropy will be reached.

Isn't equilibrium itself a kind of 'ordered state' (a state where there is perfect balance)?

Why call such a 'balanced' state a disordered one?
When the ice cube has melted, at the microscopic level there is more imbalance: different molecules have different speeds, that kind of imbalance.

There are some microstates where the molecules all have the same speed, but not many. So, very likely, the melted ice never visits those microstates during the lifetime of the universe.
 
  • #32
Deepak K Kapur said:
Sometimes, rigorous tends to mean 'accepted'.

No it doesn't. And we're back in Humpty-Dumptyland.
 
  • #33
Deepak K Kapur said:
Sometimes, rigorous tends to mean 'accepted'.

So what? Something can be rigorous even if you don't accept it.

For example, researchers were able to present a rigorous argument that the 2nd Law is based on probabilities and the existence of atoms, but it was not accepted by many physicists. The reason being that they didn't accept the premises upon which the rigorous argument was based.

The rest of the story is not relevant to this discussion, but I'll add it in for completeness. It was subsequently established that the premises are valid and that the conclusion is valid. It then makes it harder to not accept the 2nd Law as being valid, but there are still plenty of people who don't accept it despite its rigor. Just look at the number of attempted patent submissions for devices that violate the 2nd Law.
 
  • #34
Deepak K Kapur said:
You may feel annoyed, but...

Sometimes, rigorous tends to mean 'accepted'.
Do you have any intention of putting in some personal effort to actually learn this material?
 
  • #35
Deepak K Kapur said:
You may feel annoyed, but...

Sometimes, rigorous tends to mean 'accepted'.

The more precise your question, the better the answer (in science, at least). Ultimately, the idea of scientific endeavour is to build a theory that has practical implications. To do this, precise (rigorous) definitions are needed.

I'm still not 100% sure what your concerns were. Taking the lumpy cake to a more physics-y example, suppose you have a box with lots of gas atoms inside. Call the fraction of atoms on the left-hand side of the box ##f##. We expect ##f=1/2##: since all the atoms are bouncing around, they are equally likely to end up on either side. If we measure ##f## again, we again expect ##f=1/2##. Your intuition seems to be that perhaps next time ##f=0.7## or ##f=0.2## or some other random fraction. This is like your lumpy cake. You are not expecting the cake to be homogeneous. But the important point is the number of microstates corresponding to each macrostate. In our example of the box of atoms, you can think of ##f## as indicating the macrostate. There are so many microstates corresponding to ##f## being very close to ##1/2## that ##f## will come out essentially equal to ##1/2## every time.

p.s. Also, what Dale said.
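A quick simulation of BruceW's box (with hypothetical particle numbers) shows the same thing from the fluctuation side: for a handful of atoms ##f## scatters noticeably, but the relative spread shrinks like ##1/\sqrt{N}##, so for macroscopic ##N## every measurement comes out indistinguishable from ##1/2##.

```python
import random

def measure_f(N, trials=5):
    """Place N atoms at random and return the fraction landing in the left half, a few times."""
    return [sum(random.random() < 0.5 for _ in range(N)) / N for _ in range(trials)]

print(measure_f(10))       # e.g. values like 0.3, 0.6, 0.5 -- big scatter
print(measure_f(100000))   # every value ~0.500; spread ~ 1/(2*sqrt(N))
```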
 
  • #36
Having read several articles on nanosystems that defy the 2nd law, I have to wonder whether a macro system composed of multiple linked nanosystems could produce more output than input. And if not, why not?

If searching for the articles I found, include this in your search: /small-systems-defy-second-law
 
  • #37
SWB123 said:
Having read several articles
Please cite the specific articles. Keep in mind that the research being done on fluctuation theorems is often portrayed incorrectly by popular media sources.
 
  • #39
SWB123 said:
If searching for the articles I found, include this in your search: /small-systems-defy-second-law
The first hit with the exact title is from PhysicsWorld.com; the last paragraph sums it up well: "Evans and colleagues say that their discovery could be important in the design of nanomachines. They also point out that as thermodynamic systems become smaller, the probability that they will run ‘in reverse’ increases, and this could improve our understanding of how many small biological systems – such as ‘protein motors’ – work." So this is not really an entropy-violating situation, as there are outside influences...
 
  • #40
SWB123 said:
This isn't about me needing help doing a search; it was about finding out what sources you were using. As I predicted above, all three of these articles are popular science sources which are mischaracterizing the work being done on fluctuation theorems (at least the first and third are, but I only read the beginning of the second).

This is why pop sci sources are not considered valid, particularly for easily distortable topics like fluctuation theorem work.
 
  • #41
SWB123 said:
Having read several articles on nanosystems that defy the 2nd law, I have to wonder whether a macro system composed of multiple linked nanosystems could produce more output than input. And if not, why not?
The fluctuation theorem work is not about defying the 2nd law. It is about deriving the 2nd law as a classical limit of a deeper statistical mechanics law.

As their name indicates, they are about fluctuations, often fluctuations about an equilibrium. The question they address is, statistically, how long you would expect fluctuations away from equilibrium to last. Even these small systems do not decrease entropy for long, nor predictably.
 
  • #42
SWB123 said:
Having read several articles on nanosystems that defy the 2nd law, I have to wonder whether a macro system composed of multiple linked nanosystems could produce more output than input. And if not, why not?

If searching for the articles I found, include this in your search: /small-systems-defy-second-law
I'm also skeptical here. From what I understand, it's not that there is more output than input, it's that some quantum systems appear to be able to do work without dumping any heat and therefore have a thermodynamic efficiency of unity.

Also remember that you cannot scale up most quantum systems while still keeping the quantum effects.
 
  • #43
Deepak K Kapur said:
I think 99.999999% of people on this planet only acquire stuff along the way...

It's only a miniscule minority that ever says something new...
Minuscule, not miniscule.
 
  • #44
There is also entropy in communications theory, i.e. average self-information per message or symbol. For a group of ##N## equally probable symbols this is ##\log_2 N##.
 
  • #45
Deepak K Kapur said:
Our universe is considered a closed system, and the second law says that the entropy of a closed system is bound to increase.

Then how could living beings evolve, when they are such extremely ordered systems?
Things can be ordered (inefficiently) using energy. You can stack things up. They may fall later, but you can again take energy and stack them. The tendency to disorder can be defeated locally using energy.

If you take a bunch of calories in, as sugar, you can convert that sugar to carbon dioxide, and use the energy to create order within your cells. The overall process, of burning sugar, breathing in oxygen and out carbon dioxide, giving off waste heat to the air ... the overall process is one where the entropy increases. But the microscopic view of the cell shows the order THERE increasing.

The Earth is not a closed system. It has an energy input (the sun). That energy is used by living organisms to create order. Living things are engines that take in energy, and (inefficiently) do work. Living things evolved because there were energetically favorable conditions.

EDIT: Another stab at it. Consider a burning piece of wood, going from an ordered state to a bunch of smoke, ash, and combustion gases, along with a healthy dose of heat. Now suppose you put a crucible of ore above it, make a small ingot of metal, and then forge a ring. You have managed to generate an improbably well-ordered thing. But the entire process, including the fire, was one with increasing entropy.

The universe may be a closed system that is increasing in entropy. That does not mean that entropy is uniformly increasing at every location. The forged ring was a location that decreased in entropy, even as the overall system increased in entropy.
 
  • #46
votingmachine said:
If you take a bunch of calories in, as sugar, you can convert that sugar to carbon dioxide, and use the energy to create order within your cells. The overall process, of burning sugar, breathing in oxygen and out carbon dioxide, giving off waste heat to the air ... the overall process is one where the entropy increases. But the microscopic view of the cell shows the order THERE increasing.

While I would agree that cells use energy to create, maintain, and restore some order in themselves, biological organisms also have other ways of maintaining/creating order.

For example, molecules in organisms can sort out into sub-cellular groups based on adhesion properties between themselves (things that strongly stick to each other will tend to end up in a group together).
For proteins, properties like this would be due to the details of their encoding DNA sequences, which direct the production of the proper protein amino acid sequence.
The energy immediately used in the currently living cell would be put into production of the encoded proteins (this would be like a proximate cause in biology (an immediately preceding driver)).
Something more akin to a biological ultimate cause (more closely related to the reason for its evolution) would be to consider the energy put into the evolution of the encoded protein by all of the gene's predecessors and the cells or organisms in which those sequences resided as they went through their generations of evolutionary history.

This seems to me to be the accrued expense of the evolutionary building (evolving) of the DNA sequences (stored information) from which these other properties are produced anew in each cell (using currently available energy and the cellular environment), for what is probably a relatively small energetic cost that could just as easily be used to produce a similar protein with a different amino acid sequence having slightly or vastly different properties. This of course builds up over time and will simultaneously affect all the encoding sequences in the genome that are under selection in its circumstances (some traits may be selectively neutral). It's not clear to me how this energy contribution could be easily determined and assigned to a particular result. The same organisms would also be evolving all of the other sequences they have; some organisms and genes would be evolutionary dead ends, and genes may even be acquired from other species, viruses, or nowadays lab efforts.

I figure this is a less direct use of energy to generate and maintain biological order.
It should reduce the amount of energy currently needed to maintain order in currently existing cells.
 
  • #47
Deepak K Kapur said:
Our universe is considered a closed system, and the second law says that the entropy of a closed system is bound to increase.

Then how could living beings evolve, when they are such extremely ordered systems?
Think of it as a contract with the Universe: you can be alive, i.e. be a living being, but as payment you have to increase the amount of entropy by more than your constituent parts would by themselves.
 
