## Theoretical limit of serial communication

As I play on my new MacBook Pro, streaming millions of bits per second through a cable, I find myself wondering about the theoretical limit of serial communication. I've noticed that serial communication has become preferred over parallel communication over the years.

In fact, it's gotten so fast that the "bit period is shorter than the flight time."

http://en.wikipedia.org/wiki/Signal_integrity
(see the section on chip-to-chip signal integrity)

So what's the "following distance" of our bits? How far down the wire does one bit travel before the next bit follows?

I approximate the propagation speed as v = (2/3)c ≈ 2×10^8 m/s.
Assume 1 Gb/s, so each bit period is 10^-9 s.

(2×10^8 m/s) × (10^-9 s/bit) = 0.2 m = 20 cm per bit

So each bit is about 20 centimeters "behind" the previous bit. Not sure what that means, but it's pretty cool either way.
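The calculation above generalizes to any bit rate. A minimal sketch, assuming a propagation speed of 2/3 the speed of light (the function name and structure are illustrative, not from the original post):

```python
C = 3.0e8              # speed of light, m/s
V = (2.0 / 3.0) * C    # assumed propagation speed in the cable, m/s


def bit_spacing_m(bit_rate_bps):
    """Distance one bit travels down the wire before the next bit follows."""
    bit_period_s = 1.0 / bit_rate_bps
    return V * bit_period_s


for rate in (1e6, 1e9, 10e9):
    print(f"{rate:8.0e} b/s -> {bit_spacing_m(rate) * 100:.4g} cm between bits")
```

At 1 Gb/s this reproduces the 20 cm spacing; at 10 Gb/s the bits are only 2 cm apart, which is why modern serial links routinely have many bits "in flight" on the wire at once.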

What is the theoretical speed limit of such communication? The Wikipedia article discusses practical concerns like echoes, but does not address a theoretical limit. Does physics put an upper bound on the rate of this type of information transfer?

---

**Reply (Science Advisor):** I don't understand what your post is about. Bit periods have been shorter than the flight time since the beginning of electrical communication; even telegraph dots could be shorter than the time of flight when sent across the intercontinental telegraph lines in the 1800s.

As for a theoretical limit, Shannon's channel capacity theorem expresses an absolute theoretical limit on the rate at which data can be transferred across a noisy channel.
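The Shannon limit mentioned in the reply above is easy to compute. A small sketch of the Shannon-Hartley formula, C = B·log2(1 + S/N), where the example bandwidth and SNR values are illustrative assumptions:

```python
import math


def shannon_capacity_bps(bandwidth_hz, snr_linear):
    """Maximum error-free data rate (bits/s) over a noisy channel
    of bandwidth B (Hz) with linear signal-to-noise ratio S/N."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)


# Example: a 1 GHz channel with 30 dB SNR (linear SNR = 1000).
cap = shannon_capacity_bps(1e9, 1000.0)
print(f"Capacity: {cap / 1e9:.2f} Gb/s")
```

Note that the limit depends only on bandwidth and noise, not on the length of the cable or the flight time of the bits.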
---

**Reply (Science Advisor):** You might be interested in transmission line theory: http://en.wikipedia.org/wiki/Transmi...#Applicability

For the most part (certainly for most of the work I do as a non-RF electronics engineer), the signal frequency is low enough and the distance it has to travel is short enough that transmission line effects don't come into play. To quote from the Wikipedia article I linked to, "[T]he length of the wires connecting the components can for the most part be ignored."

The rule of thumb is that once the length of the conductors exceeds about 10% of the wavelength of the signal (wavelength ≈ (2/3 × speed of light) / frequency), you have to account for transmission line effects and properly terminate your conductors, or risk all manner of electrical nastiness (radiating away the signal power, signal reflections, etc.).
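The 10%-of-wavelength rule of thumb from the reply above can be sketched as follows (the 10% fraction and the 2/3·c propagation speed are the reply's stated assumptions):

```python
C = 3.0e8              # speed of light, m/s
V = (2.0 / 3.0) * C    # assumed propagation speed in the conductor, m/s


def critical_length_m(freq_hz, fraction=0.10):
    """Conductor length beyond which transmission line effects
    (reflections, radiation) must be accounted for."""
    wavelength_m = V / freq_hz
    return fraction * wavelength_m


for f in (1e6, 100e6, 1e9):
    print(f"{f:8.0e} Hz -> treat as a transmission line "
          f"beyond {critical_length_m(f) * 100:.3g} cm")
```

At 1 GHz the critical length is only about 2 cm, which is why signal integrity becomes a chip-to-chip (and even on-package) concern at gigabit rates.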