Not sure whether this is the right category, but bear with me.
I've seen graphics that project an increase in information over evolutionary time, like the one linked below from Carl Sagan's book:
https://www.researchgate.net/figure/Page-from-the-book-of-Carl-Sagan-21_fig2_322153131
I understand this is a projection, but apart from the projection itself, is there any real evidence for this increase in information?
I know it is sometimes taken as evidence that humans, for example, have more neurons than chimpanzees.
But if we take a single species, such as humans or another mammal, is there any evidence that within that species the amount of information has increased with each successive generation? For example, a human living 50,000 years ago might have held some total equivalent of X amount of information, and we now hold more.
I am asking this partly because, unlike computers, where we can interact with the machine and measure the amount of information stored in binary form, in humans we could only, in theory, measure the heat produced by added information over a lifetime. As far as I know, that heat is so small, thanks to the brain's efficiency, that it is completely lost in the organism's background heat, so I suppose there is no way to detect any change in the organism's entropy when it is learning versus when it is not.
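To make that last point concrete, here is a rough back-of-the-envelope sketch (Python) comparing the Landauer minimum energy for recording bits against the brain's metabolic output. The bit count is purely an illustrative guess, not a measured figure:

```python
import math

# Landauer's bound: erasing/recording one bit dissipates at least k_B * T * ln(2).
k_B = 1.380649e-23     # Boltzmann constant, J/K
T_body = 310.0         # approximate human body temperature, K

energy_per_bit = k_B * T_body * math.log(2)   # ~3e-21 J per bit

# Illustrative assumption: suppose a human records ~1e15 bits over a lifetime.
bits_learned = 1e15
landauer_total = bits_learned * energy_per_bit  # joules

# Compare with the brain's metabolic output (~20 W) over an 80-year lifetime.
brain_power = 20.0                               # watts, rough figure
lifetime_seconds = 80 * 365.25 * 24 * 3600
metabolic_total = brain_power * lifetime_seconds # joules

print(f"Landauer minimum for {bits_learned:.0e} bits: {landauer_total:.2e} J")
print(f"Brain metabolic output over a lifetime:        {metabolic_total:.2e} J")
print(f"Ratio: {landauer_total / metabolic_total:.1e}")
```

With these numbers the informational heat comes out around sixteen orders of magnitude below the metabolic heat, which is why I assume it would be unmeasurable in practice.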