# DNA computation

1. Dec 22, 2008

### cam875

I have a problem with DNA computation as it's currently being researched. The research is motivated by how massively parallel cells are, but how is that even useful? I keep thinking: if you have this so-called cell computer calculate every possible addition problem for a 32-bit number, what's the point? You still need enough memory to store all the answers and to sort through them. It's the same with all this quantum computation stuff: you just end up with millions of different answers that you have to sort. Maybe I'm just not understanding how cells are parallel and efficient for computation, so can someone please enlighten me? Thanks in advance.

2. Dec 22, 2008

### CRGreathouse

In a massively parallel system (like DNA computing), the computing nodes themselves are supposed to do the 'sorting'. Suppose each node is checking whether its own number divides a given large number. You could then have each node query three neighbors ("do you have the answer, or have you heard about the answer?") and repeat; after enough rounds, you only need to query a few nodes to find out whether the answer was found anywhere. You can narrow it down with similar techniques.
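The neighbor-query idea above can be simulated in ordinary code. This is a hedged sketch, not a DNA protocol: each "node" owns one candidate divisor of a number N (91 here, chosen so factors exist), checks it in parallel conceptually, and then the answer spreads by gossip rounds in which every node asks three random neighbors whether they have heard of a hit. The informed set roughly doubles per round, so everyone knows after O(log n) rounds.

```python
# Sketch of "ask your neighbors" answer propagation (gossip), assuming a
# toy setup: each node owns one candidate divisor of N and checks it.
import random

N = 91                              # number to factor (7 * 13)
nodes = list(range(2, 64))          # each node owns one candidate divisor

# Step 1: every node checks its own candidate "simultaneously".
knows = {d: (N % d == 0) for d in nodes}   # True if this node found a factor

# Step 2: gossip rounds - each node asks three random neighbors
# "have you heard about the answer?" Informed nodes spread the news.
rounds = 0
while not all(knows.values()):
    for d in nodes:
        neighbors = random.sample(nodes, 3)
        if any(knows[n] for n in neighbors):
            knows[d] = True
    rounds += 1

print(f"everyone informed after {rounds} gossip rounds")
```

The point of the sketch: no central collector ever scans all the nodes; the result finds its own way out, which is the sense in which the nodes do their own 'sorting'.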

3. Dec 22, 2008

### cam875

But what if you wanted to add 10110101 to 100001010011? Isn't that where current DNA computation efforts fall behind? And even if DNA can sort itself and all that, you still have to sift through the results, which is where it lacks as well. According to the articles I have read, it took a long time to sift through the results of the Hamiltonian path problem experiment.

4. Dec 22, 2008

### CRGreathouse

If you wanted to add two (large) binary numbers together, there are parallelizable algorithms that don't require as much carrying as the standard 'grade school' algorithm. I don't know how these could be adapted for DNA computation, though.
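One such parallelizable scheme (a conventional one, nothing DNA-specific) is parallel-prefix addition, e.g. the Kogge-Stone form: all carries are derived from per-bit generate/propagate signals in O(log n) rounds of independent operations, instead of rippling one position at a time. A minimal sketch, using the two numbers from the question above:

```python
# Kogge-Stone style parallel-prefix addition: carries computed in
# O(log n) doubling rounds rather than one bit at a time.
def prefix_add(a, b, n=32):
    g = a & b            # generate: positions that create a carry
    p = a ^ b            # propagate: positions that pass a carry along
    d = 1
    while d < n:         # combine (g, p) pairs at doubling distances
        g = g | (p & (g << d))
        p = p & (p << d)
        d *= 2
    carries = (g << 1) & ((1 << n) - 1)
    return ((a ^ b) ^ carries) & ((1 << n) - 1)

print(prefix_add(0b10110101, 0b100001010011))   # 181 + 2131 → 2312
```

Each round touches every bit position independently, which is exactly the shape of work a massively parallel substrate is good at.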

*On the Computational Power of DNA* has a good overview of DNA computing as of the last decade. The chart on p. 3 is especially helpful.

5. Dec 22, 2008

### cam875

OK, but it does seem strange, because it seems like there is only one way to really add two numbers together, and that involves a carry bit and everything, just like today's full electronic binary adders. But thanks for the link.

6. Dec 23, 2008

### CRGreathouse

Al Aho and Jeff Ullman, *Foundations of Computer Science*, ch. 13, pp. 716-721.

One way to reduce the number of carry bits for *multiplying* would be the Wallace tree multiplier:
Chris S. Wallace, "A Suggestion for a Fast Multiplier", IEEE Transactions on Electronic Computers EC-13:1 (1964), pp. 14-17.

These require O(log n) carry propagation for n-bit numbers, compared to O(n) for the usual method.
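The core trick inside a Wallace tree is carry-save addition: a layer of full adders compresses three numbers into two (a sum word and a carry word) with no carry propagation at all, bitwise and fully in parallel. Only the final two-number addition has to propagate carries. A minimal sketch:

```python
# Carry-save (3:2 compressor) addition: three inputs reduced to two
# outputs with zero carry propagation; every bit is independent.
def carry_save(a, b, c):
    s = a ^ b ^ c                          # per-bit sum, ignoring carries
    carry = (a & b) | (a & c) | (b & c)    # per-bit majority = carry out
    return s, carry << 1                   # carries enter the next position

s, c = carry_save(181, 2131, 77)
print(s + c)        # one ordinary add at the very end → 2389
```

A Wallace tree stacks these 3:2 layers to reduce many partial products down to two numbers, deferring all carry propagation to a single final adder.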

Last edited: Dec 23, 2008
7. Dec 23, 2008

### cam875

But then it involves having to multiply, does it not? Multiplying is just adding multiple times, in a sense, or am I confused?
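On the "adding multiple times" point: a binary multiplier doesn't add b copies of a; it forms one shifted partial product per *bit* of b (shift-and-add), so there are only n additions for n-bit numbers, and summing those partial products is precisely the step a Wallace tree parallelizes. A small sketch:

```python
# Shift-and-add multiplication: one shifted partial product per set bit
# of b. Summing the partials is what a Wallace tree does in parallel.
def shift_and_add(a, b):
    partials = []
    i = 0
    while b >> i:
        if (b >> i) & 1:
            partials.append(a << i)   # partial product for bit i of b
        i += 1
    return sum(partials)

print(shift_and_add(13, 11))   # → 143
```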

8. Dec 23, 2008

### Cincinnatus

I'm not in this field... but I hear people talk about DNA computing from time to time. I've always wondered, what's the point? There are plenty of other massively parallel architectures out there which sound much easier to implement. What's the advantage of DNA computing? Or is this something we're doing "just because it's cool"?

9. Dec 27, 2008

### KAckermann

DNA computing is just a curiosity. It suffers from a fundamental problem: gain. Conventional transistors have gain, which allows the signal to be lifted out of the noise.

No such luck with DNA. They are probably using a roomful of equipment just to read the DNA.

10. Dec 27, 2008

### CRGreathouse

11. Dec 28, 2008

### kingdomof

Smart drug delivery systems seem to be the goal of the research.

12. Dec 28, 2008

### Cincinnatus

Ah, of course, that makes sense. Thanks!