Theoretical limit for data transfer given frequency

In summary, the video discusses the maximum number of bits per second that can be transmitted on a digital channel. In the idealized noise-free case the bit rate is limited by the channel's bandwidth; with noise, the Shannon–Hartley theorem gives the capacity as the bandwidth times log₂(1 + signal-to-noise ratio).
  • #1
Ryan_m_b
Ages ago I read that the maximum possible bandwidth using radio waves was in the low Tbps range. I've had this rattling around in my head ever since and have occasionally tried looking it up to see if it was true. I've not had much luck with that, but it did prompt me to start wondering what the maximum possible bit rate is for different forms of communication.

I recently stumbled across this video, which filled in a lot of the questions I had. Now, though, I'd like to understand it a little better. Specifically, I've been looking around trying to see if there is a simple equation into which you can plug values for frequency and get out values for maximum bps. The linked video has two separate equations that seem to point me in the right direction, but they're not quite what I want.

As is obvious from my role here, I'm not a physicist :) more than willing to learn, but it's probably best to keep answers at a layman's level at first.
 
  • #2
M Quack
The relevant equation in the video is: [itex]\Delta f \cdot \Delta t \approx 1[/itex]

To transmit a bit you need a pulse of length (in time) [itex]\Delta t[/itex] that is either on or off.

The data rate then is about [itex]1/\Delta t[/itex], so the maximum data rate in bit/second is about the same as the bandwidth [itex]\Delta f[/itex] in Hertz.

For a real, working transmission line you need a bit of overhead for synchronization etc. so the practical data rate will be lower than this limit.

See also the Shannon–Hartley theorem, which takes the signal-to-noise ratio into account.

http://en.wikipedia.org/wiki/Shannon–Hartley_theorem
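As a rough sketch (the numbers are illustrative, not from the thread), the two limits described above - the pulse-counting estimate from [itex]\Delta f \cdot \Delta t \approx 1[/itex] and the Shannon–Hartley capacity - can be compared in a few lines of Python:

```python
import math

def max_rate_simple(bandwidth_hz):
    # Pulse-counting argument: one on/off pulse per 1/delta_f seconds,
    # so the bit rate is roughly equal to the bandwidth in hertz.
    return bandwidth_hz

def shannon_capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley: C = B * log2(1 + S/N), with S/N as a linear ratio.
    return bandwidth_hz * math.log2(1 + snr_linear)

b = 1e6  # hypothetical 1 MHz channel
print(max_rate_simple(b))        # ~1 Mbit/s from the pulse-counting estimate
print(shannon_capacity(b, 100))  # considerably more at 20 dB SNR
```

With a decent signal-to-noise ratio, the Shannon capacity comfortably exceeds the one-bit-per-hertz estimate, which is the point the later replies develop.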
 
  • #3
M Quack said:
The relevant equation in the video is: [itex]\Delta f \cdot \Delta t \approx 1[/itex]

To transmit a bit you need a pulse of length (in time) [itex]\Delta t[/itex] that is either on or off. [...] the maximum data rate in bit/second is about the same as the bandwidth [itex]\Delta f[/itex] in Hertz.

That is an incredible underestimate of the information capacity of a transmission system. It assumes that you are sending rectangular (top-hat) shaped pulses, which is something no telecoms system would involve.
To understand what information is being carried in a 'digital' channel, you need to accept that the signal being carried will be in the form of a variation of an analogue quantity (e.g. a varying voltage, current, light intensity, etc.). The actual 'information' in a noise-free time-varying signal, at any time, is not its binary value but its analogue value - to as many significant figures as you care to use (a lot of information - potentially infinite).
All digital channels pass through a low-pass (or band-pass) filter, which spreads each pulse out in time. This produces inter-symbol interference, where the analogue value of each symbol (bit etc.) affects the analogue value of those near it in time. If you display a typical binary signal, filtered to fit a particular channel bandwidth, on a correctly synced oscilloscope, you get an 'eye pattern'. As long as you sample the incoming signal at the centres of the eye pattern, you can get 'the right answer' and reconstitute your original digital data. (Yes - there may be an overhead involved, to deal with the need to synchronise your decoder, but a long enough delay can take care of that, however bad the timing is.)
In the end, the bandwidth can be reduced and reduced until the noise in the channel is enough to 'close' the eye and the channel will fail. The original Shannon paper derives this from absolute scratch - basing everything on Morse code (iirc) - and arrives at the Shannon–Hartley equation. This says that the only limit to data rate in a given bandwidth is, in fact, the signal-to-noise ratio. No system has been made to work very close to this limit, but modern systems, using modulation methods that are matched to the likely noise characteristic of a channel, do a very good job. If errors are detected and corrected, the overhead for the extra data can still produce stunning figures for the useful channel capacity. http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-451-principles-of-digital-communication-ii-spring-2005/lecture-notes/chap4.pdf describes methods whereby the Shannon limit can be approached. (If the theory is not familiar, it can be a bit demanding, as it's not intuitive.)
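To put rough numbers on the point that the signal-to-noise ratio, not bandwidth alone, sets the limit, here is an illustrative sketch (SNR values chosen arbitrarily) of the Shannon spectral efficiency - bits per second per hertz - at a few SNRs:

```python
import math

def spectral_efficiency(snr_db):
    # Shannon limit in bit/s per Hz of bandwidth, for an SNR given in dB.
    snr_linear = 10 ** (snr_db / 10)
    return math.log2(1 + snr_linear)

for snr_db in (0, 10, 20, 30):
    print(f"{snr_db:2d} dB -> {spectral_efficiency(snr_db):.2f} bit/s per Hz")
```

At 0 dB (signal equal to noise) the limit is exactly 1 bit/s per Hz, while at 30 dB it is nearly 10 - which is why a clean channel can carry far more than one bit per hertz of bandwidth.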
 
  • #4
Thanks for the replies. The reason I asked beyond the formula given in the video is that I've read previously that it isn't as simple as the frequency of the wavelength in hertz. Sophiecentaur, if the answer isn't simply the frequency in bps, then how else can bandwidth be calculated purely from frequency? Or is this not possible on its own? I had a flick through the links you provided but couldn't really make heads or tails of them (though at the moment I don't have the time to read through the last link properly).
 
  • #5
Ryan_m_b said:
Thanks for the replies [...] if the answer isn't simply the frequency in bps, then how else can bandwidth be calculated purely from frequency? Or is this not possible on its own?
There just is not a simple relationship between bit rate and channel bandwidth. As soon as you get beyond a binary data stream with a 'nice looking' waveform (limited to a generous bandwidth), you cannot say what you can get - except to use the Shannon formula to give an absolute limit. With a good enough signal-to-noise ratio and the right signal processing, you can get data at a high rate from a really narrow-band channel. (Arm waving and without numbers.)
If you want a practical number to work with then you need to read around about the various systems and how they perform. Remember just how much you can compress digital audio signals into only a few kB/s. That is precisely the same thing at work. It has to depend upon the degree and type of impairment that you can accept. Even the 'best' systems will let a glitch through every few years!
 

1. What is the "theoretical limit for data transfer given frequency"?

The theoretical limit for data transfer given frequency, also known as the Shannon–Hartley theorem, is a fundamental result in information theory that determines the maximum rate at which data can be transmitted over a communication channel with arbitrarily low error. It takes into account the bandwidth of the channel and the level of noise present in the system.

2. How is the theoretical limit for data transfer given frequency calculated?

The limit is calculated using the Shannon–Hartley formula: C = B log₂(1 + S/N), where C is the channel capacity in bits per second, B is the bandwidth in hertz, S is the signal power, and N is the noise power. This formula gives the maximum achievable data rate for a given communication channel.
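A worked example with illustrative numbers (a hypothetical 1 MHz channel at 30 dB SNR, i.e. S/N = 1000):

```python
import math

B = 1_000_000   # bandwidth in hertz (1 MHz, chosen for illustration)
S_over_N = 1000 # linear signal-to-noise ratio (30 dB)

C = B * math.log2(1 + S_over_N)  # Shannon-Hartley capacity in bit/s
print(f"{C / 1e6:.2f} Mbit/s")   # about 9.97 Mbit/s
```

So a 1 MHz channel at 30 dB SNR can, in principle, carry almost 10 Mbit/s - roughly ten bits per hertz of bandwidth.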

3. What factors affect the theoretical limit for data transfer given frequency?

The theoretical limit for data transfer given frequency is affected by various factors, including the bandwidth of the communication channel, the level of noise in the system, and the signal strength. Other factors that can impact the limit include the modulation technique used, the distance between the transmitter and receiver, and the type of encoding/decoding methods employed.

4. Can the theoretical limit for data transfer given frequency be surpassed?

No, the theoretical limit for data transfer given frequency cannot be surpassed. It is a fundamental result of information theory that represents the maximum achievable data rate for a given communication channel. Practical systems can, however, approach this limit ever more closely through advances in technology and more efficient coding and modulation techniques.

5. How is the theoretical limit for data transfer given frequency relevant in modern technology?

The theoretical limit for data transfer given frequency is highly relevant in modern technology, particularly in the development and optimization of communication systems. It provides a benchmark for the maximum data rate that can be achieved and helps in designing more efficient and reliable communication channels. This limit is also important in the study of information theory and its applications in various fields, such as telecommunications, computer networking, and data storage.
