As I play on my new MacBook Pro and stream millions of bits per second through a cable, I find myself wondering about the theoretical limit of serial communication. I have noticed that serial communication has become preferred over parallel communication over the years. In fact, it's gotten so fast that the "bit period is shorter than the flight time" (see the chip-to-chip signal integrity section of http://en.wikipedia.org/wiki/Signal_integrity).

So what's the "following distance" of our bits? How far down the wire does one bit travel before the next bit follows? Approximating the propagation speed as v = (2/3)c ≈ 2E8 m/s and assuming a 1 Gb/s rate:

(2E8 m/s) / (1E9 bits/s) = 0.2 m = 20 cm

So each bit is about 20 centimeters "behind" the previous bit. Not sure what that means, but it's pretty cool either way.

What is the theoretical speed limit of such communication? The Wikipedia article speaks of practical concerns such as echoes, but does not address a theoretical limit. Does physics put an upper bound on the rate of this type of information transfer?
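For what it's worth, here is a quick sketch of the same back-of-the-envelope calculation, assuming the signal propagates at roughly 2/3 the speed of light in the cable (the names and rates are just for illustration):

```python
# Bit "following distance" on a wire: propagation speed divided by bit rate,
# assuming v ~ (2/3)c, typical for signals in a cable.
c = 3.0e8            # speed of light in vacuum, m/s
v = (2.0 / 3.0) * c  # ~2e8 m/s assumed propagation speed

def bit_spacing_m(bit_rate_bps):
    """Distance in meters between consecutive bits on the wire."""
    return v / bit_rate_bps

for rate in (1e9, 10e9, 100e9):  # 1, 10, 100 Gb/s
    print(f"{rate / 1e9:>5.0f} Gb/s -> {bit_spacing_m(rate) * 100:.1f} cm per bit")
```

At 1 Gb/s this gives the 20 cm figure above; at 100 Gb/s the bits would be only 2 mm apart, which makes the signal-integrity concerns easy to believe.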