Is DNA Information? A Look Into Biologists' Perspectives

  • Thread starter: pctopgs
  • Tags: DNA, information
AI Thread Summary
The discussion centers on whether DNA can be classified as "information" and the implications of that classification. Participants argue that while DNA encodes instructions for proteins, labeling it as information does not necessarily imply it was created by a conscious mind. The term "information" is seen as ambiguous, with some suggesting DNA acts as a carrier of information rather than being information itself. The conversation also touches on the relationship between DNA and linguistic structures, with references to Zipf's law and the nature of information in both biological and physical contexts. Ultimately, the consensus leans towards recognizing DNA as a form of information that regulates biological processes without necessitating the involvement of intelligence in its origin.
pctopgs
I know DNA gives instructions for proteins, but is it information in the sense that it was created by a conscious mind? What about biologists who say that DNA is information?
 
pctopgs said:
I know DNA gives instructions for proteins, but is it information in the sense that it was created by a conscious mind? What about biologists who say that DNA is information?
Information does not imply a conscious mind, so no contradiction.
 
Oh here we go. I don’t think that anyone seriously disputes that DNA is information. It does not follow that it must therefore have been created by a conscious mind.
 
Information is an incredibly ambiguous word; in its rawest form you could say that it is "any event that affects the state of a dynamic system" (quoting from the wiki page), or you could talk in terms of language, etc.

To my mind DNA is information in the same way that any constituent molecule of a chemical reaction is information. DNA is a molecule that facilitates specific chemical reactions that result in the working of a biological system. To bring in notions of code/information/language is only useful for analogy.
 
My take is that DNA is not information per se, just like a hard drive is not information. They are CARRIERS of information.
 
Borek said:
My take is that DNA is not information per se, just like a hard drive is not information. They are CARRIERS of information.

That brings up the question: what is information?
 
Ok I just needed your views on this...

Does DNA follow a linguistic law? And why do biologists keep using the word "information" when people are going to exploit the ambiguity of the word for a different meaning?
 
We only use the term information either as an analogy (i.e. if referring to it in the hard-drive sense) or to describe the coding on DNA.

What do you mean by a linguistic law? :confused:
 
I don't know. Someone told me that DNA is information because it follows a linguistic law.
 
  • #10
Well, I've never heard that term at all; a quick Google didn't enlighten me any further either.
 
  • #11
I'm sorry, I meant Zipf's law... the person says Zipf's law...
 
  • #12
The point here is that linguistic structures are a specific case; they exhibit the principle of information-carrying structures, which are the general case.

Still, information carrying structures do not mean that any intelligence is involved.
 
  • #13
Looking up Zipf's law, I still don't see the relevance. As DaveC said, there is still no intelligence involved.
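
For reference, here is a minimal Python sketch (purely illustrative, not from any of the sources mentioned) of what a Zipf's-law check actually involves: count how often each token occurs - words in a text, or k-mers in a DNA sequence - and see whether frequency falls off roughly as 1/rank.

Code:
from collections import Counter

def rank_frequency(tokens):
    """Return (rank, count) pairs, most frequent token first."""
    counts = Counter(tokens).most_common()
    return [(rank, count) for rank, (_token, count) in enumerate(counts, start=1)]

# Words in a short text...
text = "the cat sat on the mat and the dog sat on the rug".split()
# ...or overlapping k-mers from a DNA string (k = 3, chosen arbitrarily here).
dna = "ATGGCGTACGTTATGGCGTAA"
kmers = [dna[i:i + 3] for i in range(len(dna) - 2)]

for tokens in (text, kmers):
    for rank, count in rank_frequency(tokens):
        # Under Zipf's law, count * rank stays roughly constant.
        print(rank, count, count * rank)
    print("---")

On toy inputs like these nothing conclusive shows up; the claim only becomes testable on long sequences.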
 
  • #14
Genome Glossary from The Human Genome Project:

DNA (deoxyribonucleic acid)
The molecule that encodes genetic information. DNA is a double-stranded molecule held together by weak bonds between base pairs of nucleotides. The four nucleotides in DNA contain the bases adenine (A), guanine (G), cytosine (C), and thymine (T). In nature, base pairs form only between A and T and between G and C; thus the base sequence of each single strand can be deduced from that of its partner.
http://www.ornl.gov/sci/techresources/Human_Genome/glossary/glossary_d.shtml
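
The "deduced from that of its partner" point is easy to make concrete. Here is a minimal Python sketch (my own illustration, not from the glossary) that reconstructs one strand from the other using the A-T and G-C pairing rules:

Code:
# Watson-Crick pairing rules: A pairs with T, G pairs with C.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement_strand(strand):
    """Deduce the partner strand from one strand's base sequence.
    The partner runs antiparallel, hence the reversal."""
    return "".join(PAIR[base] for base in reversed(strand))

print(complement_strand("ATGGTACC"))  # -> GGTACCAT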
 
  • #15
Hmmm, this has suddenly become an interesting discussion. There is a point here that is very similar to one of the common misconceptions about evolution. Usually, when two species have a similar trait, we explain that as being due to the fact that the trait existed in the common ancestor of the two species. But what about the cases where a similar trait in two species must unarguably have evolved separately because it evolved in each species after the two species diverged? The key point is to understand that the reason why the trait is similar is because the environmental pressures that led to its development were similar, so it is not so surprising that the evolved solutions were similar.

So, it is easy to see ‘information’ as something connected only with language. But in fact, if language was to serve the purpose for which it evolved, it was necessary for it to meet certain fundamental requirements. DNA, if it was to serve the purpose of being the carrier of heritable traits, had to meet the same basic requirements. So it is not so surprising that there are aspects of how DNA works that seem to connect with how language works. Still doesn’t follow that it was designed by a conscious mind just because language was.
 
  • #16
Ken Natton said:
But what about the cases where a similar trait in two species must unarguably have evolved separately because it evolved in each species after the two species diverged?
Convergent evolution.

Mammals with wings, etc.
 
  • #17
There's a very broad definition of information used in physics that is basically 'any property of a system'; for example, a cloud of gas falling into a black hole results in a loss of information - you cannot retrieve the configuration of the cloud to recreate it.

But there's a more narrow definition about codifying sequences of events. Wiki has a definition: "any kind of event that affects the state of a dynamic system," but I'm not sure it needs to affect a dynamic system.

What's been bugging me is that I cannot find an example of information (in the codified instructions sense) that does not involve life.
 
  • #18
Kevin Kelly discussed the nature of information in his blog recently and reported a conversation with Freeman Dyson on the subject. See http://www.kk.org/thetechnium/archives/2011/04/infinite_order.php - Infinite Order in All Directions.

There seems to be a contradiction between the increasingly complex structures we observe in the Universe, in galaxies and life forms, and the second law of thermodynamics, which demands increasing disorder.

Hope this helps G
 
  • #19
Ok, there is matter and there is energy. But you can't describe a system by just matter and energy, because several different arrangements of matter and energy can all have the same total matter/energy but function completely differently as a function of the geometry (arrangement) of the matter and energy.

That is what information is.

So no, DNA isn't strictly information. Information is a property of DNA.
 
  • #20
Pythagorean said:
Ok, there is matter and there is energy. But you can't describe a system by just matter and energy, because several different arrangements of matter and energy can all have the same total matter/energy but function completely differently as a function of the geometry (arrangement) of the matter and energy.

That is what information is.

 
  • #21
Going said:
Kevin Kelly discussed the nature of information in his blog recently and reported a conversation with Freeman Dyson on the subject. See http://www.kk.org/thetechnium/archives/2011/04/infinite_order.php - Infinite Order in All Directions.

There seems to be a contradiction between the increasingly complex structures we observe in the Universe, in galaxies and life forms, and the second law of thermodynamics, which demands increasing disorder.

Hope this helps G

Increasing complexity can arise out of a system as long as global entropy increases as well.
 
  • #22
We see information (the codified form created by humans) always coming from intelligence. Could we, by strong inference, say that DNA is also from an intelligence? (Regardless of whether we know the nature of the intelligence, except that it is an intelligence somewhat similar to our own?)
 
  • #23
ryan_m_b said:
Increasing complexity can arise out of a system as long as global entropy increases as well.

Thank you for not making me point that out :wink:
 
  • #24
pctopgs said:
We see information (the codified form created by humans) always coming from intelligence.
No, we don't.
Could we, by strong inference, say that DNA is also from an intelligence?
The premise is flawed, therefore the logical inference is flawed.

I think it is easy to see a strand of DNA as a sort of geometric/chemical hard drive with coded information on it (is it base-3 or base-6?), but I don't see any reason at all to think that that implies intelligence created it.
 
  • #25
DaveC426913 said:
But there's a more narrow definition about codifying sequences of events. Wiki has a definition: "any kind of event that affects the state of a dynamic system," but I'm not sure it needs to affect a dynamic system.

What's been bugging me is that I cannot find an example of information (in the codified instructions sense) that does not involve life.

This is definitely a tricky issue because "information" has two orthogonal meanings. One is about the physically countable - the bits, the microstates. The atomistic entities that would compose a more global state of order or disorder. And the other actually is about meaning - information as knowledge or regulation or constraint. And DNA embodies both these senses of the word.

It is a code, a collection of atomistic bits. And note, DNA is as far from dynamic as you can get. In the hot thermal jostle of a cell, DNA stands aloof. It remains always countable as a microstate. Like the flipped magnetic particles of a hard drive, there is no uncertainty about the encoded sequence.

Yet DNA also stands in meaningful relation to rate-dependent dynamics. As Wiki says, information is an event that affects the state of a dynamic system. Or to be more precise, the best biologically-oriented definition of information is it is "any constraint on entropy production". So this stresses the regulation of dynamics. And also makes it plain we are talking second law thermo-dynamics. Gradients of energy to be dissipated. DNA is the information that constrains the dynamics of metabolic processes.

So DNA is information however you define information (or whichever of the two orthogonal meanings you want to stress). But it is very easy to get confused about whether you are talking syntax or semantics. The physical basis of the coding mechanism or the meaning of some particular coded message.
 
  • #26
pctopgs said:
We see information (the codified form created by humans) always coming from intelligence. Could we, by strong inference, say that DNA is also from an intelligence? (Regardless of whether we know the nature of the intelligence, except that it is an intelligence somewhat similar to our own?)

Theoretical biologists would actually see it the other way round. They would generalise the particular human case to the most general physical case.

As described, for example, by Howard Pattee's epistemic cut.

http://binghamton.academia.edu/Howa...physics_of_symbols_bridging_the_epistemic_cut
 
  • #27
Let me add a little more to what I have previously posted. This is from the U.S. National Library of Medicine's Handbook updated on May 11, 2011. Here are three snippets from "What is DNA":

DNA, or deoxyribonucleic acid, is the hereditary material in humans and almost all other organisms. Nearly every cell in a person’s body has the same DNA. Most DNA is located in the cell nucleus (where it is called nuclear DNA), but a small amount of DNA can also be found in the mitochondria (where it is called mitochondrial DNA or mtDNA).

[. . .]

The information in DNA is stored as a code made up of four chemical bases: adenine (A), guanine (G), cytosine (C), and thymine (T). Human DNA consists of about 3 billion bases, and more than 99 percent of those bases are the same in all people. The order, or sequence, of these bases determines the information available for building and maintaining an organism, similar to the way in which letters of the alphabet appear in a certain order to form words and sentences.

[. . .]

An important property of DNA is that it can replicate, or make copies of itself. Each strand of DNA in the double helix can serve as a pattern for duplicating the sequence of bases. This is critical when cells divide because each new cell needs to have an exact copy of the DNA present in the old cell.

[. . .]
http://ghr.nlm.nih.gov/handbook/basics/dna
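
To make the "letters forming words and sentences" analogy concrete, here is a minimal Python sketch (my own illustration, using only a tiny hand-picked slice of the standard genetic code) that reads a base sequence three letters at a time:

Code:
# A deliberately incomplete slice of the standard genetic code, for illustration.
CODON_TABLE = {
    "ATG": "Met", "TTT": "Phe", "AAA": "Lys",
    "GGG": "Gly", "TGG": "Trp", "TAA": "STOP",
}

def translate(dna):
    """Read the sequence codon by codon until a stop codon (or the end)."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE.get(dna[i:i + 3], "?")
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

print(translate("ATGTTTAAAGGGTGGTAA"))  # -> ['Met', 'Phe', 'Lys', 'Gly', 'Trp']

The same bases read in a different order, or in a shifted frame, spell out a different "sentence" - which is the sense in which the sequence itself carries the information.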



DaveC426913 said:
Convergent evolution.

Mammals with wings, etc.

Thanks Dave. Reminds me of an interesting article from May 18, 2011 entitled "Errors in protein structure sparked evolution of biological complexity" - Packing flaws create "sticky," interactive proteins that spread in small populations:

Over four billion years of evolution, plants and animals grew far more complex than their single-celled ancestors. But a new comparison of proteins shared across species finds that complex organisms, including humans, have accumulated structural weaknesses that may have actually launched the long journey from microbe to man.

The study, published in Nature, suggests that the random introduction of errors into proteins, rather than traditional natural selection, may have boosted the evolution of biological complexity. Flaws in the "packing" of proteins that make them more unstable in water could have promoted protein interactions and intracellular teamwork, expanding the possibilities of life.

"Everybody wants to say that evolution is equivalent to natural selection and that things that are sophisticated and complex have been absolutely selected for," said study co-author Ariel Fernández, PhD, a visiting scholar at the University of Chicago and senior researcher at the Mathematics Institute of Argentina (IAM) in Buenos Aires. "What we are claiming here is that inefficient selection creates a niche or an opportunity to evolve complexity."
[. . .]
Please read on . . .
http://www.uchospitals.edu/news/2011/20110518-protein.html
 
  • #28
You could think of it in terms of the "Selfish gene" concept. Every gene that makes up your body wants to replicate itself i.e. it wants to make more copies of itself (the use of the word "want" is not in the sense of a conscious tendency to do some activity; I am using such words only to avoid a passive voice). However, only replication is not enough; ensuring the survival of the copies is also important. For this purpose, genes have come together and have made bodies which act as their protective vehicles.

These bodies are very well organized structures; every organelle in a cell and every cell in a body has its own place and function. However, nature doesn't prefer such a coordinated arrangement of particles; things always tend to operate randomly in their natural condition. Therefore, in order to maintain the same state of organisation, you need information.

And that information is carried by DNA.
 
  • #29
Hey guys, I have to bring this post back because I got some news on the subject.

OK, the reason I started this thread is because someone is claiming that DNA is a "language" used to convey "information" the same way that English is a language used to convey information... and they used that to say that DNA could therefore only come from a mind... which he concludes is God...

Here are some sources I finally got...

http://www.sciencedirect.com/science/article/pii/S0378437102017879

I haven't read all these articles though, sorry if I am wasting your time.
 
  • #30
pctopgs said:
Hey guys, I have to bring this post back because I got some news on the subject.

OK, the reason I started this thread is because someone is claiming that DNA is a "language" used to convey "information" the same way that English is a language used to convey information... and they used that to say that DNA could therefore only come from a mind... which he concludes is God...

The problem with the argument you say your friend makes is that it is circular.

As his premise, he defines both information and language as "something that must come from an intelligent mind", thus when he finds information stored in the language of DNA, he concludes that it must be from an intelligent mind.

The question is: what makes him conclude that information and language must come from an intelligent mind?

This will take him aback, and he will say 'it is self-evident', which is not an answer.
 
  • #31
As a separate issue, I propose this:

The one thing that is essential for the creation of language and information is life. I cannot think of any examples of language or information that do not have life at their foundation.

So, with those two statements I've formed a self-consistent 1:1 relationship between life and information/language. The existence or absence of God is unnecessary to draw the above conclusions.
 
  • #32
I agree with Dave.

There's nothing so special about information except that we haven't thought about it much until recently, so it's mysterious and neat (to some).

Other than that, information just is. There's no reason why it should have a) come from an intelligent mind or b) appeared out of thin air. DNA does store and convey information, yes, but so do charged particle configurations. There's no intelligence required.

It would be neat, some day, to be able to talk about how information was generated the same way we talk about how mass was generated in the early universe. Was information generated... or was it fundamental?

Obviously, at the time of the singularity, information must have existed. Perhaps at the end of the universe, there will be no information. Everything will just be uniform: maximum entropy. All we can do is speculate until we understand the nature of information better.
 
  • #33
ryan_m_b said:
Information is an incredibly ambiguous word; in its rawest form you could say that it is "any event that affects the state of a dynamic system" (quoting from the wiki page), or you could talk in terms of language, etc.

To my mind DNA is information in the same way that any constituent molecule of a chemical reaction is information. DNA is a molecule that facilitates specific chemical reactions that result in the working of a biological system. To bring in notions of code/information/language is only useful for analogy.

The concept of information has actually been quite well defined since Claude Shannon published his work on information theory in 1948. The Shannon information of an event E (such as a particular sequence of base pairs) can be expressed as:

$$I(E) = -c \log_2 P(E)$$

where P(E) is the probability of E and c is a positive constant.

There is a close relation between entropy and information as formal concepts. Entropy is a function of the number of possible states of a system, and information is a function of the realization of a particular state of that system. If a system can have many possible states, then the realization of a particular state will have a high value in information measure. With log base 2 the measure is in 'bits'. A system that can exist in only one state will have zero bits of information with the realization of that state.
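
As a quick numerical illustration, here is a minimal Python sketch (my own, and it assumes the symbols are independent and identically distributed, which real DNA is not):

Code:
import math
from collections import Counter

def shannon_information(p):
    """Information (in bits) of an event with probability p: I = -log2(p)."""
    return -math.log2(p)

def entropy(sequence):
    """Average information per symbol: H = -sum(p_i * log2(p_i))."""
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in Counter(sequence).values())

print(shannon_information(0.25))         # one of four equally likely bases -> 2.0 bits
print(entropy("ATGGCGTACGTTATGGCGTAA"))  # at most 2 bits/base for a 4-letter alphabet
print(entropy("AAAAAAAA"))               # a system with only one possible symbol -> 0.0 bits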
 
  • #34
Pythagorean said:
Other than that, information just is. There's no reason why it should have a) come from an intelligent mind or b) appeared out of thin air. DNA does store and convey information, yes, but so do charged particle configurations. There's no intelligence required.

Thing is, I am finding a concept here that identifies a particular kind of information. I just haven't figured out how to define it yet.

The closest I can come is 'blueprints'.

There's all sorts of information out there in the universe but only a tiny subset of it is used the way life uses it - as a language that follows rules to build stuff. I can't think of a single non-life example where configurations are stored in an abstract form, then "read" to make something.
 
  • #35
DaveC426913 said:
Thing is, I am finding a concept here that identifies a particular kind of information. I just haven't figured out how to define it yet.

The closest I can come is 'blueprints'.

There's all sorts of information out there in the universe but only a tiny subset of it is used the way life uses it - as a language that follows rules to build stuff. I can't think of a single non-life example where configurations are stored in an abstract form, then "read" to make something.

So as I referenced in post #26, why not employ the concepts that have been developed by biologists like Howard Pattee?

Following in the footsteps of Shannon, information has become reified as a universal substance and so robbed of its original sense of "carrying meaning".

This created a useful general tool of scientific measurement but has led to huge ontological confusion as well.

Thus we find these semi-religious debates about "where is the meaning in information?". Well, that is where you have to turn to supplementary theories such as Pattee's epistemic cut so as to understand in what precise sense DNA is now "information" as well as information.
 
  • #36
Perhaps this is tangential to where the discussion has gone, but I've just discovered the thread...

Douglas Hofstadter, in Gödel, Escher, Bach, questions whether the instructions to make a given set of proteins can really be said to be stored in DNA. His discussion is extensive (being woven through several chapters), but it comes down to asking how much of that information is in the base sequences in the DNA and how much is added by the chemical context in the cell? By analogy, he asks how much of the meaning of a text (like the Rosetta stone) is in the characters and how much is in the context provided by the culture? Or how much of the emotional response to a song is in the melody or chord sequence and how much is added by the listener? How much of the meaning of a John Cage composition is carried by the context of the tapestry of western music?

He also refers to Schrödinger's observation that humans expect to find meaning in aperiodic crystals: things that show regular form on one level but not on another. A book (being a bound, orderly stack of rectangular pages with markings on the pages that are drawn from a small set of characters) has a regular form. The order of the markings, however, is not obviously patterned until a decoding mechanism is found.

It's a very engaging discussion, which I probably do not do justice to in my summary.
 
  • #37
SW VandeCarr said:
The concept of information has actually been quite well defined since Claude Shannon published his work on information theory in 1948. The Shannon information of an event E (such as a particular sequence of base pairs) can be expressed as:

$$I(E) = -c \log_2 P(E)$$

where P(E) is the probability of E and c is a positive constant.

There is a close relation between entropy and information as formal concepts. Entropy is a function of the number of possible states of a system, and information is a function of the realization of a particular state of that system. If a system can have many possible states, then the realization of a particular state will have a high value in information measure. With log base 2 the measure is in 'bits'. A system that can exist in only one state will have zero bits of information with the realization of that state.

All very true, when I said information is ambiguous I meant the colloquial use. Some will use it as "language", some geneticists might say that most of the DNA is not information if they are referring to non-coding sequences etc etc.
 
  • #38
ryan_m_b said:
All very true, when I said information is ambiguous I meant the colloquial use. Some will use it as "language", some geneticists might say that most of the DNA is not information if they are referring to non-coding sequences etc etc.

Well, think about how we represent any kind of knowledge, or even the neural pathways by which sense data is processed, stored and acted upon. Language is not a random collection of sounds or symbols. Given all possible configurations of a string of length n using k symbols including spaces (its entropy), only a small fraction of these configurations carry information in any human language, and a still smaller fraction in a language you might understand. Of all possible visual patterns that could be processed by the human retina, only a small fraction are patterns that make sense based on our training and expectations.

In pharma R&D there are sophisticated programs for visualizing the conformational dynamics of proteins and other activities involving receptor sites and ion channels. Obviously these programs are nothing more or less than pure information, which can be measured in bits (or nits or dits if you prefer, but I'll stick with bits; the other two use the natural log base or base 10).
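
To put a rough number on how small that fraction is, here is a back-of-the-envelope Python sketch (the 27-symbol alphabet and the count of meaningful phrases are assumed figures, purely for illustration):

Code:
import math

k = 27          # assumed alphabet: 26 letters plus a space
n = 10          # string length
total = k ** n  # every possible configuration of the string

print(n * math.log2(k))  # entropy of the unconstrained string: ~47.5 bits
print(total)             # ~2.06e14 possible strings

# Suppose, purely as an assumed figure, that ten million of those strings
# are meaningful English phrases; the information-carrying fraction is tiny.
meaningful = 10_000_000
print(meaningful / total)  # ~4.9e-8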
 
  • #39
DaveC426913 said:
Thing is, I am finding a concept here that identifies a particular kind of information. I just haven't figured out how to define it yet.

The closest I can come is 'blueprints'.

There's all sorts of information out there in the universe but only a tiny subset of it is used the way life uses it - as a language that follows rules to build stuff. I can't think of a single non-life example where configurations are stored in an abstract form, then "read" to make something.

Ah, I hadn't noticed your other post. I think the general form of information doesn't require life, but yes, I agree that life "makes use" of information in a sophisticated way, social creatures even more so, and humans (in my biased opinion) make the most of it and even generate it solely for the purpose of generating information so that others can receive it (i.e. art). Notice, I'm still not talking about meaning though, only the transmission and reception of visual information (in the case of art).

Meaning hinges on semantic information, which is when a system has a very, very large memory capacity and begins to classify particular information structures as a "type" (so you have a word for "apple", a semantic designation that comes from several exposures to similar-looking information: information conveying an apple). That's where redundancy and compression come in. In physics, we start with hundreds of equations, shoehorn them into 4 equations (Maxwell's equations), then pressure-cook them into one last, final equation using the d'Alembertian. So there's not really a lot of information in that last, final equation. All the information is really in the brain. The last, final equation is more-or-less a title which a human can use to unpack (from their own brain) the deeper meanings through derivations (another algorithm, another "blueprint", but a procedural one).

But ultimately, meaning is simply compressed information (semantics or semiotics) that represents a larger set of information (your episodic exposure to the concept). The word itself contains very little information. The neuroethological complex that the word sets off in a human brain is really where the bulk of the information has been stored through iterative exposure and association.

But "blue prints" are essentially a map of the general information. The map tells you the geometry: what goes where. In the universe then, a Newtonian/euclidian map would tell you the position and momentum of every particle, along with a matrix of interactions between all possible pairs of particles (i.e. forces).
 