0xDEADBEEF said:
I don't think so. The way it looks today, another person's mind will end up in a density-matrix state where it believes A or B with a certain probability, and we'll have the small field of quantum consciousness, where we see how algorithms or minds or whatever can hold entangled states and superpositions and make use of them. It doesn't look too useful right now, so I don't think there is a need for quantum computation as a discipline outside of computer science. But to paraphrase von Neumann: "These predictions always look silly in retrospect."
The only question that is really crazy is: if you look into your brain and observe the states, will it look like your brain is the only one that doesn't do superpositions?
I don't think it's wise to blend 'digital' states with analogue.
Your consciousness works with an awful lot of information every 'instant', sorting out what is important for you.
It does not work at a 'bit state'/'rate'.
The brain is not 'digital' and it's not 'bit by bit' serial.
Even when you read about 'quad-core' CPUs,
it's still serial signal processing inside them and out on the bus(es).
The brain is more like analogue.
Containing a high ratio of simultaneous 'noise' aka information.
And the brain sorts that out.
At all times, although adapting to your 'needs', like sleeping, hunting, etc.
This site gives a good example of it when it tries to guess/translate our eyes' visual information into 'digital':
http://www.wdv.com/Eye/EyeBandwidth/
Reading Mister Warren, you might find him confusing at times. :)
I did too. So I tried to translate his equations into pixels here.
1 petabyte = 1000 terabytes, where every terabyte represents 1000 gigabytes.
So 4000 terabytes times 1000 = 4,000,000 gigabytes, where 1 gigabyte is 1024 megabytes.
Counted that way we have four million times 1024 = 4,096,000,000 megabytes.
4,096,000,000 megabytes times 1024 is 4,194,304,000,000 kilobytes.
And (as one kilobyte = 1024 bytes) 4,194,304,000,000 times 1024 gives us 4,294,967,296,000,000 bytes.
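The conversion chain above can be checked with a few lines of Python. This is just a sketch of the post's own arithmetic, keeping its mix of decimal prefixes for the PB→TB→GB steps and binary prefixes for the GB→MB→KB→byte steps:

```python
# Same conversion as in the post: decimal down to gigabytes,
# binary (x1024) from gigabytes down to bytes.
petabytes = 4
terabytes = petabytes * 1000      # 4,000 TB
gigabytes = terabytes * 1000      # 4,000,000 GB
megabytes = gigabytes * 1024      # 4,096,000,000 MB
kilobytes = megabytes * 1024      # 4,194,304,000,000 KB
total_bytes = kilobytes * 1024    # 4,294,967,296,000,000 B
print(total_bytes)                # 4294967296000000
```

Running it confirms the 4,294,967,296,000,000-byte figure.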
And it takes approximately 3 bytes to characterize each pixel, if I got it right.
That's because every pixel is made up of R, G, and B channels and requires one byte for each channel.
("Therefore, one pixel is 3 bytes, 1 megapixel is 3 megabytes, etc."
But that is not really correct, as one megabyte is 1024 times 1024 bytes = 1,048,576 bytes (of eight bits each) digitally,
and one megapixel is 1,000,000 pixels, so one million pixels should then be three million bytes.
That translates to 145,728 bytes less than three megabytes.
If one had said megabit instead of megabyte it would be consistent though,
as one megabit is a decimal 1,000,000 bits.)
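The megapixel-versus-megabyte gap in the parenthesis above works out like this, assuming the post's figures of 3 bytes per pixel and a binary megabyte:

```python
# One megapixel at 3 bytes per pixel (one byte each for R, G, B),
# compared against "3 megabytes" in the binary sense.
bytes_per_pixel = 3
megapixel_bytes = 1_000_000 * bytes_per_pixel   # 3,000,000 bytes
three_megabytes = 3 * 1024 * 1024               # 3,145,728 bytes
print(three_megabytes - megapixel_bytes)        # 145728
```

So "1 megapixel is 3 megabytes" overshoots by exactly the 145,728 bytes mentioned above.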
So when we split 4,294,967,296,000,000 bytes in three, it gives us 1,431,655,765,333,333.33 pixels for a two-hour movie.
And that I won't even try to write out in letters.
If I got it right this time?
And as two hours is 7200 seconds, dividing 1,431,655,765,333,333.33 pixels by 7200 gives 198,841,078,518.52 pixels per second.
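The last two steps can be sketched the same way, taking the byte total from earlier in the post as given:

```python
# Divide the byte total by 3 bytes per pixel, then spread the
# pixels over the two-hour (7200 s) running time.
total_bytes = 4_294_967_296_000_000
pixels = total_bytes / 3          # ~1.43e15 pixels
seconds = 2 * 60 * 60             # 7200 seconds in two hours
print(pixels / seconds)           # ~1.99e11 pixels per second
```

That reproduces the roughly 198.8 billion pixels per second quoted above.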