When will e-bits overtake bio-bits?

  • Thread starter: Loren Booda
AI Thread Summary
The discussion centers on the comparison between the information capacity of computers and biological systems, particularly in terms of DNA and neural signals. It highlights that while computers have surpassed biological systems in information density, the complexity of biological information remains significant. Biological information is continuous, whereas computer data is discrete, complicating direct comparisons. The conversation touches on the evolving capabilities of computers, including their strategic thinking and potential to replicate human creativity, though there is skepticism about whether machines can truly achieve consciousness or replicate the nuances of human creativity. Additionally, the challenges of simulating human cognitive processes in computers are noted, emphasizing that current technology may not yet be capable of fully mimicking human visual processing.
Loren Booda
When will the capacity of computers surpass characteristic biological information?
 
I cannot decipher this question.

- Warren
 
Loren, what biological information are you referring to?
 
"Electronic" bits (e-bits) constitute the information capacity of artificial analog, binary and generic-quantum signals, memories and processors. Biological bits (bio-bits) constitute life's comprehensive DNA coding, neural signals, natural intercommunication, memories (including existing physical adaptation to environment?) and processor (e. g., brain) information capacity. I am trying to estimate when, if not already, the former will exceed the latter.
 
In terms of information density, computers exceeded biological systems a long time ago.

- Warren
 
Biological information is continuous, while computer data is discrete, so we cannot compare them!
 
Considering the overall magnitude (not necessarily density) of discrete, "binary" information (biological genetic code vs electronic bits), could someone calculate a rough comparison between the two?
 
Because there are four nucleotides, each of them is the equivalent of 2 bits! Computers nowadays store information on hard disks. If you take a high-end PC as a reference, figure about 200 GB for the hard disk (200 billion bytes, or roughly 1.6 trillion bits). At 2 bits per nucleotide, that is on the order of 800 billion nucleotides' worth of storage; then consider the question of whether there are more or fewer nucleotides in the nuclear DNA than that.
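A rough back-of-the-envelope version of that comparison, sketched in Python and assuming the commonly quoted figure of about 3 billion base pairs for the haploid human genome (a number not given in this thread):

Code:
# Rough comparison: human genome vs. a 200 GB hard disk.
# Assumption: ~3e9 base pairs (commonly quoted haploid genome size).
BITS_PER_NUCLEOTIDE = 2              # 4 bases -> log2(4) = 2 bits each
base_pairs = 3_000_000_000           # assumed genome size
genome_bits = base_pairs * BITS_PER_NUCLEOTIDE

disk_bytes = 200 * 10**9             # 200 GB, decimal gigabytes
disk_bits = disk_bytes * 8

print(f"genome: {genome_bits:.1e} bits (about {genome_bits // (8 * 10**6)} MB)")
print(f"disk:   {disk_bits:.1e} bits")
print(f"ratio:  the disk holds roughly {disk_bits / genome_bits:.0f}x the genome")

Even if you double the base-pair count for a diploid nucleus, the disk figure still comes out a couple of orders of magnitude larger.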
 
200 GB is 1024^3 * 200 bytes, i.e. roughly 1.7 × 10^12 bits...
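Just to make the unit bookkeeping explicit (a minimal sketch; the decimal and binary readings of "GB" differ only slightly):

Code:
# 200 GB in bits, under both readings of "GB"
decimal_bits = 200 * 10**9 * 8      # 200 * 10^9 bytes * 8  -> 1.6e12 bits
binary_bits  = 200 * 1024**3 * 8    # 200 * 2^30 bytes * 8  -> ~1.72e12 bits
print(decimal_bits, binary_bits)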
 
  • #10
chroot said:
I cannot decipher this question.

- Warren

That's funny, it doesn't appear to be written in cipher :-p

This question involves a lot of controversy and opinion. Chroot was right in saying that computers exceeded humanity in terms of information density, and now they are even beginning to exceed humanity in terms of strategy. Still, there are people (like me) who like to believe that human creativity is something that cannot be replicated accurately in a machine or "artificial life form".
 
  • #11
Asphyxi8 said:
This question involves a lot of controversy and opinion. Chroot was right in saying that computers exceeded humanity in terms of information density,

I doubt that assertion. The DNA alone packs all the genetic information of a human being in a cell nucleus.

Asphyxi8 said:
and now they are even beginning to exceed humanity in terms of strategy.

It depends on what you call 'strategy'. Humans are usually more flexible in adapting high-level strategies.
And, importantly, computers have not reached the level of consciousness yet.

Asphyxi8 said:
Still, there are people (like me) who like to believe that human creativity is something that cannot be replicated accurately in a machine or "artificial life form".

Artificial Intelligence is trying to do that and is achieving amazing results.
 
  • #12
chroot said:
In terms of information density, computers exceeded biological systems a long time ago.

- Warren

Depends. The amount of memory it would take to run an accurate one-to-one simulation of a single human's visual processing would still be too much for any given supercomputer... storage is not the same thing as direct processing.

I saw the math done somewhere, and I remember being somewhat taken aback by how 'small' an amount of memory even modern systems can address.

(for clarification: that's mostly due to the operating systems, of course)
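As a rough illustration of the addressing point (my own sketch, not part of the original post), the ceiling comes from the pointer width the operating system and hardware use:

Code:
# Maximum directly addressable memory for common pointer widths.
for pointer_bits in (32, 64):
    addressable_bytes = 2 ** pointer_bits
    print(f"{pointer_bits}-bit addressing: {addressable_bytes:.2e} bytes "
          f"(~{addressable_bytes / 2**30:,.0f} GiB)")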
 