# Theoretical limit for data transfer given frequency

1. Oct 2, 2014

### Ryan_m_b

Staff Emeritus
Ages ago I read that the maximum possible bandwidth using radio waves was in the low Tbit/s range. I've had this rattling around in my head ever since and have occasionally tried looking it up to see if it was true. I've not had much luck with that, but it did prompt me to start wondering what the maximum possible number of bits per second is for different forms of communication.

I recently stumbled across this video which filled in a lot of the questions I had. Now though I'd like to understand it a little better; specifically, I've been looking around to see if there is a simple equation into which you can plug values for frequency and get out values for maximum bps. The video linked has two separate equations that seem to point me in the right direction, but they're not quite what I want.

As is obvious from my role here, I'm not a physicist :) I'm more than willing to learn, but it's probably best to keep answers at a layman's level at first.

2. Oct 2, 2014

### M Quack

The relevant equation in the video is: $\Delta f \cdot \Delta t \approx 1$

To transmit a bit you need a pulse of length (in time) $\Delta t$ that is either on or off.

The data rate is then about $1/\Delta t$, so the maximum data rate in bits/second is about the same as the bandwidth $\Delta f$ in hertz.

For a real, working transmission line you need some overhead for synchronization etc., so the practical data rate will be lower than this limit.

See also the Shannon–Hartley theorem, which takes the signal-to-noise ratio into account.

http://en.wikipedia.org/wiki/Shannon–Hartley_theorem
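To make the Shannon–Hartley theorem concrete, here is a small sketch of my own (the 1 MHz bandwidth and the SNR values are just illustrative numbers, not from the video): the capacity is $C = B \log_2(1 + S/N)$, so for a fixed bandwidth the achievable rate grows with the signal-to-noise ratio.

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A hypothetical 1 MHz channel at a few signal-to-noise ratios.
for snr_db in (0, 10, 20, 30):
    snr = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
    c = shannon_capacity(1e6, snr)
    print(f"SNR = {snr_db:2d} dB -> capacity = {c / 1e6:.2f} Mbit/s")
```

Note that at 0 dB (signal power equal to noise power) the capacity is exactly 1 bit per second per hertz, which matches the rough $\Delta f \cdot \Delta t \approx 1$ estimate above; higher SNR buys you more than that.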

3. Oct 2, 2014

### sophiecentaur

That is a serious underestimate of the information capacity of a transmission system. It assumes that you are sending rectangular (top-hat) pulses, which is something no telecoms system would actually use.
To understand what information is being carried in a 'digital' channel, you need to accept that the signal being carried will be in the form of a varying analogue quantity (e.g. a voltage, a current, a light intensity, etc.). The actual 'information' in a noise-free time-varying signal, at any time, is not its binary value but its analogue value, to as many significant figures as you care to use (a lot of, potentially infinite, information).
All digital channels pass through a low (or band) pass filter, which spreads each pulse out in time. This produces inter-symbol interference, where the analogue value of each symbol (bit etc.) affects the analogue values of those near it in time. If you display a typical binary signal, filtered to fit a particular channel bandwidth, on a correctly synced oscilloscope, you get an 'eye pattern' (see this link and many others). As long as you sample the incoming signal at the centres of the eye pattern, you can get 'the right answer' and reconstitute your original digital data. (Yes, there may be an overhead involved to deal with the need to synchronise your decoder, but a long enough delay can take care of that, however bad the timing is.)
In the end, the bandwidth can be reduced and reduced until the noise in the channel is enough to 'close' the eye, and the channel will fail. The original Shannon paper derives this from absolute scratch, basing everything on Morse code (iirc), and arrives at the Shannon–Hartley equation. This says that the only limit to data rate in a given bandwidth is, in fact, the signal-to-noise ratio. No system has been made to work very close to this limit, but modern systems, using modulation methods matched to the likely noise characteristics of a channel, do a very good job. If errors are detected and corrected, the overhead for the extra data can still produce stunning figures for the useful channel capacity. http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-451-principles-of-digital-communication-ii-spring-2005/lecture-notes/chap4.pdf describes methods whereby the Shannon limit can be approached. (If the theory is not familiar, it can be a bit demanding, as it's not intuitive.)
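One way to see why the '1 bit per hertz' figure is an underestimate, sketched in code with illustrative numbers of my own (the 1 Msym/s symbol rate is hypothetical): each transmitted symbol can take more than two analogue levels, and an M-level symbol carries $\log_2 M$ bits, so the bit rate in a fixed bandwidth grows with the number of distinguishable levels, until noise makes them indistinguishable.

```python
import math

def bit_rate(symbol_rate_hz, num_levels):
    """Bit rate for M-ary signalling: log2(M) bits per symbol."""
    return symbol_rate_hz * math.log2(num_levels)

# Binary signalling (M = 2) gives the naive 1 bit/symbol estimate;
# multi-level symbols pack more bits into the same bandwidth.
for m in (2, 4, 16, 256):
    r = bit_rate(1e6, m)
    print(f"{m:3d} levels at 1 Msym/s -> {r / 1e6:.0f} Mbit/s")
```

The catch, as described above, is that the more levels you use, the closer together they sit, and the sooner channel noise 'closes the eye'.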

Last edited by a moderator: May 7, 2017
4. Oct 2, 2014

### Ryan_m_b

Staff Emeritus
Thanks for the replies. The reason I asked, beyond the formula given in the video, is that I've read previously that it isn't as simple as the frequency in hertz. Sophiecentaur, if the answer isn't simply the frequency in bps, then how else can bandwidth be calculated purely from frequency? Or is this not possible on its own? I had a flick through the links you provided but couldn't really make head or tail of them (though at the moment I don't have the time to read through the last link properly).

5. Oct 2, 2014

### sophiecentaur

There just is not a simple relationship between bit rate and channel bandwidth. As soon as you go beyond a binary data stream with a 'nice looking' waveform (limited to a generous bandwidth), you cannot say what you can get, except to use the Shannon formula to give an absolute limit. With a good enough signal-to-noise ratio and the right signal processing, you can get data at a high rate from a really narrow-band channel. (Arm waving and without numbers.)
If you want a practical number to work with, then you need to read around about the various systems and how they perform. Remember just how much you can compress digital audio signals into only a few kB/s; that is precisely the same thing at work. It has to depend upon the degree and type of impairment that you can accept. Even the 'best' systems will let a glitch through every few years!
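To put the 'high rate from a narrow channel' point into numbers (a sketch of my own; the target rate and bandwidth below are made-up illustrative figures), you can invert the Shannon–Hartley formula to find the SNR a narrow channel would need to support a given bit rate:

```python
import math

def required_snr_db(target_rate_bps, bandwidth_hz):
    """SNR (in dB) needed to reach a target rate in a given bandwidth,
    by inverting the Shannon-Hartley capacity formula."""
    snr_linear = 2 ** (target_rate_bps / bandwidth_hz) - 1
    return 10 * math.log10(snr_linear)

# Hypothetical example: 30 kbit/s through a 3 kHz channel.
print(f"Need about {required_snr_db(30e3, 3e3):.1f} dB SNR")
```

The exponential in that formula is the whole story: squeezing ten bits per second into each hertz of bandwidth is possible in principle, but only with a very clean channel.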