Can anyone help me understand how superposition leads to increased computational ability? I'm pretty sure I have all the background in conventional computing covered: I started programming BASIC 30 years ago and jumped directly into assembly language (position-independent machine code, then completely self-contained programs), then Pascal in high school to learn program structure, then on to 80386 coding of center-out rendered polygon engines (without a math coprocessor), and then stayed current with true multitasking and OOP in Java.

What I can't seem to comprehend is how "0 or 1 or both" is expected to increase computing power so dramatically. I have already contemplated optical processing, where a bit (register would be a better description) could hold a value from 0 to 1 at whatever frequency the photonic "register" contains, to the degree of precision the hardware could discern. Anyone who has a clue what I'm talking about would immediately see the potential; I'm just not sure whether the scheme is physically feasible yet.

So how would a string of qubits/bytes/registers compare? Is the probabilistic nature of superposition expected to fulfill my concept of "0 to 1"?
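For what it's worth, here is a minimal numerical sketch (assuming NumPy; the variable names are my own) of the distinction the question turns on: an analog "0 to 1" register holds one value, while a register of n qubits is described by 2^n complex amplitudes at once, which is where the claimed scaling comes from.

```python
import numpy as np

# An analog optical register holds ONE value in [0, 1] (to hardware precision).
analog_register = 0.7071  # a single real number

# An n-qubit register is described by 2**n complex amplitudes simultaneously.
n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # start in the basis state |000>

# Put each qubit into an equal superposition by applying a Hadamard gate,
# building the full-register operator via Kronecker products.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
for qubit in range(n):
    op = np.array([[1.0]])
    for q in range(n):
        op = np.kron(op, H if q == qubit else np.eye(2))
    state = op @ state

# The register now carries 2**n = 8 amplitudes at once; measuring it
# collapses to ONE of the 8 basis states, here each with probability 1/8.
probs = np.abs(state) ** 2
print(len(state))          # 8
print(np.round(probs, 3))  # [0.125 0.125 0.125 0.125 0.125 0.125 0.125 0.125]
```

The catch, as I understand it, is that you can't read out all 2^n amplitudes; a measurement yields only one n-bit outcome, so quantum algorithms have to arrange interference so that the answer you want dominates the probabilities.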