Is DNA information?

  • Thread starter: pctopgs
  • Tags: dna
  • #26
apeiron
Gold Member
We see information (the codified form created by humans) always coming from intelligence. Could we, by strong inference, say that DNA is also from an intelligence? (Regardless of whether we know the nature of the intelligence, except that it is an intelligence somehow similar to our own?)
Theoretical biologists would actually see it the other way round. They would generalise the particular human case to the most general physical case.

As described, for example, in Howard Pattee's work on the epistemic cut.

http://binghamton.academia.edu/Howa...physics_of_symbols_bridging_the_epistemic_cut
 
  • #27
Let me add a little more to what I have previously posted. This is from the U.S. National Library of Medicine's Handbook updated on May 11, 2011. Here are three snippets from "What is DNA":

DNA, or deoxyribonucleic acid, is the hereditary material in humans and almost all other organisms. Nearly every cell in a person’s body has the same DNA. Most DNA is located in the cell nucleus (where it is called nuclear DNA), but a small amount of DNA can also be found in the mitochondria (where it is called mitochondrial DNA or mtDNA).

[. . .]

The information in DNA is stored as a code made up of four chemical bases: adenine (A), guanine (G), cytosine (C), and thymine (T). Human DNA consists of about 3 billion bases, and more than 99 percent of those bases are the same in all people. The order, or sequence, of these bases determines the information available for building and maintaining an organism, similar to the way in which letters of the alphabet appear in a certain order to form words and sentences.

[. . .]

An important property of DNA is that it can replicate, or make copies of itself. Each strand of DNA in the double helix can serve as a pattern for duplicating the sequence of bases. This is critical when cells divide because each new cell needs to have an exact copy of the DNA present in the old cell.

[. . .]
http://ghr.nlm.nih.gov/handbook/basics/dna
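
The "pattern for duplicating" idea in that last snippet is easy to illustrate in code. Here is a minimal Python sketch (ignoring strand orientation; the biological complement is read antiparallel, i.e. reversed):

Code:
# Watson-Crick pairing: each base determines its partner on the other strand
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement_strand(strand):
    # The sequence of one strand fixes the sequence of its partner,
    # which is why each strand can serve as a template for copying
    return "".join(PAIR[base] for base in strand)

print(complement_strand("ACGTTGCA"))   # TGCAACGT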


Convergent evolution.

Mammals with wings, etc.
Thanks Dave. Reminds me of an interesting article from May 18, 2011 entitled "Errors in protein structure sparked evolution of biological complexity" - Packing flaws create "sticky," interactive proteins that spread in small populations:

Over four billion years of evolution, plants and animals grew far more complex than their single-celled ancestors. But a new comparison of proteins shared across species finds that complex organisms, including humans, have accumulated structural weaknesses that may have actually launched the long journey from microbe to man.

The study, published in Nature, suggests that the random introduction of errors into proteins, rather than traditional natural selection, may have boosted the evolution of biological complexity. Flaws in the "packing" of proteins that make them more unstable in water could have promoted protein interactions and intracellular teamwork, expanding the possibilities of life.

"Everybody wants to say that evolution is equivalent to natural selection and that things that are sophisticated and complex have been absolutely selected for," said study co-author Ariel Fernández, PhD, a visiting scholar at the University of Chicago and senior researcher at the Mathematics Institute of Argentina (IAM) in Buenos Aires. "What we are claiming here is that inefficient selection creates a niche or an opportunity to evolve complexity."
[. . .]
Please read on . . .
http://www.uchospitals.edu/news/2011/20110518-protein.html
 
  • #28
You could think of it in terms of the "selfish gene" concept. Every gene that makes up your body wants to replicate itself, i.e. it wants to make more copies of itself (the word "want" here does not imply a conscious tendency; I am using such words only to avoid the passive voice). However, replication alone is not enough; ensuring the survival of the copies is also important. For this purpose, genes have come together and made bodies which act as their protective vehicles.

These bodies are very well organized structures; every organelle in a cell and every cell in a body has its own place and function. Nature, however, does not favour such a coordinated arrangement of particles; left to themselves, physical systems tend toward disorder. Therefore, in order to maintain this state of organisation, you need information.

And that information is carried by DNA.
 
  • #29
Hey guys, I have to bring this post back because I got some news on the subject.

OK, the reason I started this thread is that someone is claiming that DNA is a "language" used to convey "information" the same way that English is a language used to convey information... and they used that to argue that DNA could therefore only have come from a mind, which, he concludes, is God...

Here are some sources I finally got....

http://www.sciencedirect.com/science/article/pii/S0378437102017879

I haven't read all of these articles, though, so sorry if I'm wasting your time.
 
  • #30
DaveC426913
Gold Member
Hey guys, I have to bring this post back because I got some news on the subject.

OK, the reason I started this thread is that someone is claiming that DNA is a "language" used to convey "information" the same way that English is a language used to convey information... and they used that to argue that DNA could therefore only have come from a mind, which, he concludes, is God...
The problem with the argument you report your friend making is that it is circular.

As his premise, he defines both information and language as "something that must come from an intelligent mind"; thus, when he finds information stored in the language of DNA, he concludes that it must be from an intelligent mind.

The question is: what makes him conclude that information and language must come from an intelligent mind?

This will likely take him aback, and he will say "it is self-evident", which is not an answer.
 
  • #31
DaveC426913
Gold Member
As a separate issue, I propose this:

The one thing that is essential for the creation of language and information is life. I cannot think of any examples of language or information that do not have life at their foundation.

So, with those two statements I've formed a self-consistent 1:1 relationship between life and information/language. The existence or absence of God is unnecessary to draw the above conclusions.
 
  • #32
Pythagorean
Gold Member
I agree with Dave.

There's nothing so special about information except that we haven't thought about it much until recently, so it's mysterious and neat (to some).

Other than that, information just is. There's no reason why it should have a) come from an intelligent mind or b) appeared out of thin air. DNA does store and convey information, yes, but so do charged particle configurations. There's no intelligence required.

It would be neat to someday be able to talk about how information was generated, the same way we talk about how mass was generated in the early universe. Was information generated... or was it fundamental?

Obviously, at the time of the singularity, information must have existed. Perhaps at the end of the universe there will be no information; everything will just be uniform, at maximum entropy. All we can do is speculate until we understand the nature of information better.
 
  • #33
Information is an incredibly ambiguous word, in the sense that in its rawest form you could say it is "any event that affects the state of a dynamic system" (quoting from the wiki page), or you could talk in terms of language, etc.

To my mind DNA is information in the same way that any constituent molecule of a chemical reaction is information. DNA is a molecule that facilitates specific chemical reactions that result in the working of a biological system. To bring in notions of code/information/language is only useful for analogy.
The concept of information has actually been quite well defined since Claude Shannon published his work on information theory in 1948. The Shannon information of an event E (such as a particular sequence of base pairs) can be expressed as:

[tex]I(E) = -c \log_2 P(E)[/tex]

where P(E) is the probability of E and c is a positive constant.

There is a close relation between entropy and information as formal concepts. Entropy is a function of the number of possible states of a system, and information is a function of the realization of a particular state of that system. If a system can have many possible states, then the realization of a particular state will have a high information measure. With log base 2, the measure is in bits. A system that can exist in only one state will have zero bits of information upon the realization of that state.
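
To make the formula concrete, here is a minimal Python sketch with c = 1 and the simplifying assumption that the four bases are equiprobable and independent (real genomes are not this uniform):

Code:
import math

def shannon_information_bits(p):
    # Shannon information (in bits) of an event with probability p, c = 1
    return -math.log2(p)

# Toy assumption: the four bases A, C, G, T are equiprobable and
# independent, so each base carries -log2(1/4) = 2 bits.
p_base = 1.0 / 4.0
print(shannon_information_bits(p_base))                    # 2.0

# A specific 10-base sequence then has probability (1/4)**10,
# i.e. 20 bits of information under this oversimplified model.
sequence = "ACGTTGCAGA"
print(shannon_information_bits(p_base ** len(sequence)))   # 20.0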
 
  • #34
DaveC426913
Gold Member
Other than that, information just is. There's no reason why it should have a) come from an intelligent mind or b) appeared out of thin air. DNA does store and convey information, yes, but so do charged particle configurations. There's no intelligence required.
Thing is, I am finding a concept here that identifies a particular kind of information. I just haven't figured out how to define it yet.

The closest I can come is 'blueprints'.

There's all sorts of information out there in the universe but only a tiny subset of it is used the way life uses it - as a language that follows rules to build stuff. I can't think of a single non-life example where configurations are stored in an abstract form, then "read" to make something.
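
To make the "stored in an abstract form, then read to build" idea concrete, here is a minimal Python sketch of the reading step. It skips transcription entirely and uses a toy four-entry codon table (the real genetic code has 64 entries), so it is an illustration, not biochemistry:

Code:
# Triplets of bases (codons) are looked up in a table to assemble a
# protein chain. Toy table: only 4 of the 64 real codons.
CODON_TABLE = {
    "ATG": "Met",   # start codon
    "TGG": "Trp",
    "AAA": "Lys",
    "TAA": "STOP",
}

def translate(dna):
    # Read the string three bases at a time and build a peptide
    peptide = []
    for i in range(0, len(dna) - 2, 3):
        residue = CODON_TABLE.get(dna[i:i + 3], "???")
        if residue == "STOP":
            break
        peptide.append(residue)
    return "-".join(peptide)

print(translate("ATGTGGAAATAA"))   # Met-Trp-Lys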
 
  • #35
apeiron
Gold Member
Thing is, I am finding a concept here that identifies a particular kind of information. I just haven't figured out how to define it yet.

The closest I can come is 'blueprints'.

There's all sorts of information out there in the universe but only a tiny subset of it is used the way life uses it - as a language that follows rules to build stuff. I can't think of a single non-life example where configurations are stored in an abstract form, then "read" to make something.
So as I referenced in post #26, why not employ the concepts that have been developed by biologists like Howard Pattee?

In the wake of Shannon, information has become reified as a universal substance and so been robbed of its original sense of "carrying meaning".

This created a useful general tool of scientific measurement but has led to huge ontological confusion as well.

Thus we find these semi-religious debates about "where is the meaning in information?". Well, that is where you have to turn to supplementary theories such as Pattee's epistemic cut, so as to understand in what precise sense DNA is "information" in the meaningful sense as well as information in the Shannon sense.
 
  • #36
Perhaps this is tangential to where the discussion has gone, but I've just discovered the thread...

Douglas Hofstadter, in Gödel, Escher, Bach, questions whether the instructions to make a given set of proteins can really be said to be stored in DNA. His discussion is extensive (being woven through several chapters), but it comes down to asking how much of that information is in the base sequences of the DNA and how much is added by the chemical context in the cell. By analogy, he asks how much of the meaning of a text (like the Rosetta Stone) is in the characters and how much is in the context provided by the culture. Or how much of the emotional response to a song is in the melody or chord sequence, and how much is added by the listener? How much of the meaning of a John Cage composition is carried by the context of the tapestry of Western music?

He also refers to Schrödinger's observation that humans expect to find meaning in aperiodic crystals: things that show regular form on one level but not on another. A book (being a bound, orderly stack of rectangular pages with markings drawn from a small set of characters) has a regular form. The order of the markings, however, is not obviously patterned until a decoding mechanism is found.

It's a very engaging discussion, which I probably do not do justice to in my summary.
 
  • #37
Ryan_m_b
Staff Emeritus
Science Advisor
The concept of information has actually been quite well defined since Claude Shannon published his work on information theory in 1948. The Shannon information of an event E (such as a particular sequence of base pairs) can be expressed as:

[tex]I(E) = -c \log_2 P(E)[/tex]

where P(E) is the probability of E and c is a positive constant.

There is a close relation between entropy and information as formal concepts. Entropy is a function of the number of possible states of a system, and information is a function of the realization of a particular state of that system. If a system can have many possible states, then the realization of a particular state will have a high information measure. With log base 2, the measure is in bits. A system that can exist in only one state will have zero bits of information upon the realization of that state.
All very true; when I said information is ambiguous I meant the colloquial use. Some will use it as "language"; some geneticists might say that most of DNA is not information if they are referring to non-coding sequences, etc.
 
  • #38
All very true; when I said information is ambiguous I meant the colloquial use. Some will use it as "language"; some geneticists might say that most of DNA is not information if they are referring to non-coding sequences, etc.
Well, think about how we represent any kind of knowledge, or even the neural pathways by which sense data is processed, stored and acted upon. Language is not a random collection of sounds or symbols. Of all possible configurations of a string of length n using k symbols including spaces (its entropy), only a small fraction carry information in any human language, and a still smaller fraction in a language you might understand. Likewise, of all possible visual patterns that could be processed by the human retina, only a small fraction make sense based on our training and expectations.
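
A crude way to see this numerically is a Python sketch comparing the maximum per-character entropy of a 27-symbol alphabet with a single-character frequency estimate for an English sample. (This estimate ignores all longer-range structure; Shannon's own experiments put English closer to about 1 bit per character.)

Code:
import math
from collections import Counter

def empirical_entropy_bits(text):
    # Per-character Shannon entropy from observed character frequencies
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Maximum entropy for 27 symbols (a-z plus space): a fully random string
print(math.log2(27))                     # about 4.75 bits per character

# English is far more constrained, so even this crude estimate is lower
sample = "the quick brown fox jumps over the lazy dog " * 20
print(empirical_entropy_bits(sample))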

In pharma R&D there are sophisticated programs for visualizing the conformal dynamics of proteins and other activities involving receptor sites and ion channels. Obviously these programs are nothing more or less than pure information, which can be measured in bits (or nits or dits if you prefer, but I'll stick with bits; the other two use the natural log base or base 10).
 
  • #39
Pythagorean
Gold Member
Thing is, I am finding a concept here that identifies a particular kind of information. I just haven't figured out how to define it yet.

The closest I can come is 'blueprints'.

There's all sorts of information out there in the universe but only a tiny subset of it is used the way life uses it - as a language that follows rules to build stuff. I can't think of a single non-life example where configurations are stored in an abstract form, then "read" to make something.
Ah, I hadn't noticed your other post. I think the general form of information doesn't require life, but yes, I agree that life "makes use" of information in a sophisticated way, social creatures even more so, and humans (in my biased opinion) make the most of it and even generate it solely so that others can receive it (i.e. art). Notice, I'm still not talking about meaning, though, only the transmission and reception of visual information (in the case of art).

Meaning hinges on semantic information, which arises when a system has a very, very large memory capacity and begins to classify particular information structures as a "type" (so you have a word for "apple", a semantic designation that comes from several exposures to information that looks similar: information conveying an apple). That's where redundancy and compression come in. In physics, we start with hundreds of equations, shoehorn them into four equations (Maxwell's equations), then pressure-cook them into one last, final equation using the d'Alembertian. So there's not really a lot of information in that last, final equation. All the information is really in the brain. The final equation is more or less a title which a human can use to unpack (from their own brain) the deeper meanings through derivations (another algorithm, another "blueprint", but a procedural one).

But ultimately, meaning is simply compressed information (semantics or semiotics) that represents a larger set of information (your episodic exposure to the concept). The word itself contains very little information. The neuroethological complex that the word sets off in a human brain is really where the bulk of the information has been stored, through iterative exposure and association.

But "blue prints" are essentially a map of the general information. The map tells you the geometry: what goes where. In the universe then, a newtonian/euclidian map would tell you the position and momentum of every particle, along with a matrix of interactions between all possible pairs of particles (i.e. forces).
 
