# Understand the concept of bandwidth

fog37
TL;DR Summary
Concept of Bandwidth
Hello,
I am trying to clearly understand the concept of bandwidth.

Let's assume we have two types of cable and a signal ##x(t)## that travels a distance ##L## along each cable. The first cable has bandwidth ##BW1## and the signal travels at speed ##v1## along it. The other cable has bandwidth ##BW2## and speed ##v2##.
$$BW1>BW2$$
The bandwidth ##BW## of a cable represents the interval of spectral frequencies, from ##f=0## Hz to ##f_{max}##, that are not significantly attenuated by transmission through the channel. Let's assume that the signal ##x(t)## transmitted along each cable has a bandwidth ##BW2 < BW_{signal} < BW1##. The signal's bandwidth indicates how complex the signal is: if the signal is digital, the higher its bandwidth the more "data" is embedded/represented by the signal, correct? I think that when the signal travels along cable 2, it reaches its destination distorted (which does not happen with cable 1).
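A quick numpy sketch of this idea (a toy model with made-up numbers; each "cable" is just an ideal brick-wall low-pass, which real cables are not):

```python
# Toy model: the same pulse train through two ideal low-pass "cables".
# The brick-wall filter and all numbers here are illustrative assumptions.
import numpy as np

fs = 1000.0                                          # sample rate, Hz
t = np.arange(0, 1, 1 / fs)                          # 1 s of signal
x = (np.sin(2 * np.pi * 25 * t) > 0).astype(float)   # 25 Hz pulse train

def channel(sig, bw_hz):
    """Ideal brick-wall low-pass: zero all spectral content above bw_hz."""
    X = np.fft.rfft(sig)
    f = np.fft.rfftfreq(len(sig), 1 / fs)
    X[f > bw_hz] = 0
    return np.fft.irfft(X, n=len(sig))

y_wide = channel(x, 400.0)    # cable 1: BW1 well above the signal's bandwidth
y_narrow = channel(x, 30.0)   # cable 2: BW2 barely above the fundamental

err_wide = np.max(np.abs(y_wide - x))
err_narrow = np.max(np.abs(y_narrow - x))
print(err_wide < err_narrow)  # True: the narrow cable distorts far more
```

The narrow channel strips the harmonics that make the pulses sharp, so the pulses arrive rounded and smeared, exactly the distortion described above.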

Does the speed ##v## of the channel matter at all? Is there any connection between the channel's bandwidth and the channel's speed? I am assuming the speed to be constant across the bandwidth. I believe that BOTH (speed and bandwidth) are critical, correct?

For example, fiber optics cables have both a higher signal speed and bandwidth when compared to copper cables. Is that why fiber allows for faster data transmission?

In both copper and fiber optics, the signal is modulating a high frequency carrier signal...

Thanks!

Staff Emeritus
This is not an adequate answer, but it may help. Pulses smear together when they are too close together and conditions aren't perfect. The imperfections set a bandwidth limit to the system.

fog37
Does the speed v of the channel matter at all?
No.
A transmission line can contain many bits of traveling data at the same time.
Data latency is increased for slower transmission lines, but the bandwidth is not reduced.
Longer packets of data can eliminate much of the effect of that latency.
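A rough numeric sketch of that point (all numbers illustrative, not measured): two links with the same signalling rate differ only in latency and in how many bits are "in flight", not in throughput.

```python
# Illustrative numbers only: two 100 km links, same 1 Gb/s signalling rate,
# different propagation velocities.
c = 3.0e8                    # speed of light in vacuum, m/s
length_m = 100e3             # link length: 100 km
rate_bps = 1e9               # both links signal at 1 Gb/s

latency_fast = length_m / (0.67 * c)   # fiber-like line, ~0.67c
latency_slow = length_m / (0.33 * c)   # much slower line, ~0.33c

# Throughput is the signalling rate either way; only the delay differs,
# and with it the number of bits in flight (bandwidth-delay product).
in_flight_fast = rate_bps * latency_fast
in_flight_slow = rate_bps * latency_slow

print(latency_fast, latency_slow)      # ~0.0005 s vs ~0.001 s
print(in_flight_fast, in_flight_slow)  # ~0.5 Mb vs ~1 Mb in flight
```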

fog37
fog37
Thanks!
So, for fiber optics, the main point for high data transmission is the available bandwidth, more than the near-light speed at which light travels inside the fiber core.

Multiple optical carrier signals travel in the fiber at the same time, each carrying signals with very large bandwidths, without attenuation significantly affecting them...

Mentor
and a signal ##x(t)## that travels a distance ##L## along each cable.
IMO, it's very confusing to use ##x(t)## to represent a voltage waveform propagating down a communication cable. Perhaps use ##V(t)## or similar instead? The symbol "x" usually denotes a distance, not a voltage...

The bandwidth ##BW## of a cable represents the interval of spectral frequencies, from ##f=0## Hz to ##f_{max}##, that are not significantly attenuated by transmission through the channel.
It is not common for a communication channel to be baseband (including DC). It is much more common for a communication channel or cable to be optimized for the propagation of signals modulating carrier frequencies.

The signal's bandwidth indicates how complex the signal is: if the signal is digital, the higher its bandwidth the more "data" is embedded/represented by the signal, correct? I think that when the signal travels along cable 2, it reaches its destination distorted (which does not happen with cable 1).
Bandwidth and signal complexity are not really related. You can have a very complex Spread Spectrum transmission or use OOFDM in the same BW as you use simple AM signals. Each modulation scheme has different advantages and disadvantages and uses, and certainly OOFDM is more complex than AM...
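One hedged way to quantify this: Shannon's formula ##C = B\log_2(1+SNR)## caps the data rate of any modulation scheme in a given bandwidth, so complexity only determines how close a scheme gets to the cap. A quick sketch with arbitrary example numbers:

```python
# Hedged sketch: Shannon capacity C = B*log2(1 + SNR) for an AWGN channel.
# The bandwidth and SNR values below are arbitrary examples.
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Channel capacity in bits/s for an AWGN channel."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# The same 1 MHz channel at two different signal-to-noise ratios:
print(shannon_capacity_bps(1e6, 30))   # ~9.97e6 b/s ceiling at 30 dB SNR
print(shannon_capacity_bps(1e6, 3))    # ~1.58e6 b/s ceiling at 3 dB SNR
```

Whether you fill that 1 MHz with AM, spread spectrum, or OFDM, the ceiling is set by ##B## and SNR, not by how intricate the modulation is.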

Does the speed ##v## of the channel matter at all? Is there any connection between the channel's bandwidth and the channel's speed? I am assuming the speed to be constant across the bandwidth. I believe that BOTH (speed and bandwidth) are critical, correct?
"Speed" or more accurately propagation velocity is less important than "Dispersion" and "Loss versus Frequency". You will generally want your communication channel to not be dispersive across your target BW, and similarly you want your parasitic losses to also be relatively flat across the BW of interest in the channel.
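A toy numerical sketch of that dispersion point (my own construction; the delay function and numbers are arbitrary): a channel whose delay grows with frequency smears a pulse even though it attenuates nothing.

```python
# Toy dispersion model: an all-pass channel where delay rises with
# frequency. All numbers are arbitrary illustrative choices.
import numpy as np

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
pulse = np.exp(-((t - 0.5) / 0.01) ** 2)   # narrow Gaussian pulse, peak 1.0

def propagate(sig, delay_fn):
    """Apply a per-frequency delay delay_fn(f) as a pure phase shift."""
    X = np.fft.rfft(sig)
    f = np.fft.rfftfreq(len(sig), 1 / fs)
    return np.fft.irfft(X * np.exp(-2j * np.pi * f * delay_fn(f)), n=len(sig))

flat = propagate(pulse, lambda f: 0.1 + 0 * f)          # same delay for all f
dispersive = propagate(pulse, lambda f: 0.1 + f / 4e3)  # delay rises with f

print(np.max(flat))        # ~1.0: pulse arrives intact, just delayed
print(np.max(dispersive))  # noticeably below 1.0: the pulse has smeared
```

The flat-delay channel only shifts the pulse in time (latency); the dispersive one spreads its energy out, which is what eventually makes neighboring pulses smear together.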

For example, fiber optics cables have both a higher signal speed and bandwidth when compared to copper cables. Is that why fiber allows for faster data transmission?
Why do you say that optical fiber has an inherently different propagation velocity compared to coax or twisted pair cable? Do you have a reference for that?

Gold Member
Thanks!
So, for fiber optics, the main point for high data transmission is the available bandwidth, more than the near-light speed at which light travels inside the fiber core.

Multiple optical carrier signals travel in the fiber at the same time, each carrying signals with very large bandwidths, without attenuation significantly affecting them...
Yes. Electrical cables will still carry signals at something like 1/3 of the speed of light, while fiber optic cables will carry signals at about 2/3 of the speed of light.

collinsmark and fog37
Mentor
Electrical cables will still carry signals at something like 1/3 of the speed of light
Not the cables I use...

DaveE and fog37
Mentor
Take for example your vanilla Ethernet cable...

fog37 and dlgoff
fog37
Take for example your vanilla Ethernet cable...

I see.

So, consider a signal ##V(t)## that is a sequence of square pulses over a time interval of ##1s##. The more pulses there are, the narrower each pulse must be and the larger the signal's bandwidth (Fourier theory).

I see your point about the speed of an electrical signal in copper vs the speed of light inside a fiber or in free space: the difference is not that much even if the speed is higher in the fiber.

Overall, the primary factors are: the optical carrier signals have a very high carrier frequency, and the data signals they "carry" can have a very large bandwidth ##BW##. This allows many high frequency optical carriers to carry signals with large bandwidths without interfering (wavelength division multiplexing, WDM).

In copper cabling, I think the carrier frequencies are lower, so, even if the carrier signals "transported" signals with large bandwidth, interference would be an issue.

berkeman
Mentor
So, consider a signal ##V(t)## that is a sequence of square pulses over a time interval of ##1s##. The more pulses there are, the narrower each pulse must be and the larger the signal's bandwidth (Fourier theory).
Not exactly. When you say "square pulses", it will be the rise and fall times of those pulses that will determine the required BW of the channel, not the repetition rate of the pulses.
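A common rule of thumb connecting the two: for a single-pole (RC-like) channel the 10%-90% rise time satisfies ##t_r \approx 0.35/BW##. A quick numerical check, assuming a simple first-order response (which real channels only approximate):

```python
# Check the t_r ~ 0.35 / BW rule of thumb on a simulated first-order
# step response. The 1 MHz bandwidth is an arbitrary example value.
import numpy as np

bw_hz = 1e6                           # assumed -3 dB bandwidth
tau = 1 / (2 * np.pi * bw_hz)         # first-order time constant
t = np.linspace(0, 10 * tau, 100000)
step = 1 - np.exp(-t / tau)           # step response of the low-pass

t10 = t[np.searchsorted(step, 0.1)]   # time to reach 10% of final value
t90 = t[np.searchsorted(step, 0.9)]   # time to reach 90%
rise_time = t90 - t10

print(rise_time * bw_hz)              # ~0.35, independent of bw_hz
```

This is why the channel bandwidth needed is set by how fast the edges must be, not by how often the pulses repeat.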

Overall, the primary factors are: the optical carrier signals have a very high carrier frequency, and the data signals they "carry" can have a very large bandwidth ##BW##. This allows many high frequency optical carriers to carry signals with large bandwidths without interfering (wavelength division multiplexing, WDM).
Yes, the frequency of the carriers has a big effect on the BW of the channel, as long as you use an appropriate modulation scheme.

In copper cabling, I think the carrier frequencies are lower, so, even if the carrier signals "transported" signals with large bandwidth, interference would be an issue.
Yes, optical THz frequencies are definitely higher than copper GHz frequencies.

fog37
From the physics standpoint, I think that a carrier signal, which is a continuous sinusoidal signal of frequency ##f_{carrier}##, can be modulated fast (ASK, PSK or another digital modulation technique) only if the frequency is high "enough". My point is that a high carrier frequency is good because

a) it helps avoid interference with other carriers
b) it is required in order to support a high modulation rate (a high modulation rate corresponds to a high-bandwidth information signal transported by the carrier)

For example, consider a carrier sinusoid with amplitude 5V: if we applied ON-OFF keying (0V and 5V) to it, the ON portion of the modulation would generate a pulse that needs to include at least one full period of the carrier... Chopping a low frequency carrier too fast would produce pulses that contain only a fraction of a period and would probably not be good for detection...

Mentor
From the physics standpoint, I think that a carrier signal, which is a continuous sinusoidal signal of frequency ##f_{carrier}##, can be modulated fast (ASK, PSK or another digital modulation technique) only if the frequency is high "enough". My point is that a high carrier frequency is good because

a) it helps avoid interference with other carriers
b) it is required in order to support a high modulation rate (a high modulation rate corresponds to a high-bandwidth information signal transported by the carrier)

For example, consider a carrier sinusoid with amplitude 5V: if we applied ON-OFF keying (0V and 5V) to it, the ON portion of the modulation would generate a pulse that needs to include at least one full period of the carrier... Chopping a low frequency carrier too fast would produce pulses that contain only a fraction of a period and would probably not be good for detection...
It is probably not fruitful for us to try to fill in small niches of information when you are lacking the overall framework of knowledge of communication systems and modulation. Please spend some time reading the Wikipedia article:

https://en.wikipedia.org/wiki/Modulation

https://www.abebooks.com/book-search/title/introduction-communication-systems/author/stremler/

There are many important aspects to communication theory that cannot be simply explained by "propagation velocity" or "bandwidth" or even "dispersion".

Once you have looked through these references, we would be happy to answer your specific questions about points raised in them. Thanks.

fog37 and anorlunda
Gold Member
So, given a signal V(t): sequence of square pulses over a time interval of 1s. The more the pulses, the narrower the pulses must be and the larger the signal's bandwidth (Fourier theory).
It's important to note that nobody (but a teacher with a whiteboard) actually deals in square pulses. If the (always) analogue devices that carry a digital signal can generate and detect 'square waves' then there's some bad engineering going on.

Any transmission system that's making efficient use of the available spectrum will introduce distortion and interference between pulses on the same signal. This is called inter-symbol interference (ISI), and it can be dealt with using appropriate filtering, which can greatly reduce its effects. The process can involve looking at a long string of symbols either side of the wanted symbol, so there may be a delay (mostly irrelevant).

The presence of noise will affect the reliability of demodulating the data but, again, an efficient system will use an appropriate data rate so that decision errors in decoding the values of the levels are infrequent enough to be detected and / or corrected in the coding. Looking at a scope trace of a clever system may show you just a blur of fuzzy lines but the system will give a very low error rate. Not a square wave in sight.
Chopping a low frequency carrier too fast would produce pulses that contains fraction of a period and would probably not good for detection...
This is the same sort of effect that you get with undersampling of a video or audio signal. The Nyquist criterion imposes a limit on what is possible without producing aliases folded back amongst the wanted signal spectrum.
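A minimal numeric illustration of that fold-back (my own example numbers): sample a 700 Hz tone at 1 kHz, i.e. above the ##f_s/2 = 500## Hz Nyquist limit, and it reappears as a 300 Hz alias.

```python
# Minimal aliasing sketch: a 700 Hz tone sampled at 1 kHz violates
# Nyquist (fs/2 = 500 Hz) and folds back to 300 Hz.
import numpy as np

fs = 1000.0                                 # sample rate, Hz
n = np.arange(1000)                         # 1 s of samples
tone = np.cos(2 * np.pi * 700 * n / fs)     # 700 Hz input, above Nyquist

spectrum = np.abs(np.fft.rfft(tone))
freqs = np.fft.rfftfreq(len(tone), 1 / fs)
print(freqs[np.argmax(spectrum)])           # 300.0 -- the alias, not 700
```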

The high data rates available on optical systems are largely because the fractional bandwidth at optical frequencies can be very wide.
It is probably not fruitful for us to try to fill in small niches of information when you are lacking the overall framework of knowledge of communication systems and modulation.
I agree. Do lots of reading round this huge topic.

Staff Emeritus
Any transmission system that's making efficient use of the available spectrum will introduce distortion and interference between pulses on the same signal.
That reminded me of a favorite anecdote. It is marginally off-topic but I can't resist.

From Dyer, Frank Lewis; Martin, Thomas Commerford. Edison, His Life and Inventions

Edison was now asked if he thought he could get a greater speed through submarine cables with this system than with the regular methods, and replied that he would like a chance to try it. For this purpose, twenty-two hundred miles of Brazilian cable, then stored under water in tanks at the Greenwich works of the Telegraph Construction & Maintenance Company, near London, was placed at his disposal from 8 P.M. until 6 A.M. "This just suited me, as I preferred night-work. I got my apparatus down and set up, and then to get a preliminary idea of what the distortion of the signal would be, I sent a single dot, which should have been recorded upon my automatic paper by a mark about one-thirty-second of an inch long. Instead of that it was twenty-seven feet long! ... What I did not know at the time was that a coiled cable, owing to induction, was infinitely worse than when laid out straight, and that my speed was as good as, if not better than, with the regular system; but no one told me this."

That led to Edison discovering a ceiling on his bandwidth. "I worked on this cable more than two weeks, and the best I could do was two words per minute, which was only one-seventh of what the guaranteed speed of the cable should be when laid."

berkeman
That reminded me of a favorite anecdote. It is marginally off-topic but I can't resist.
When it comes to distortion on long lines the hero has to be Oliver Heaviside, the telegrapher and amateur mathematician who came up with the telegrapher's equations, patented coaxial cable, and reformulated Maxwell's equations into their current form.
https://en.wikipedia.org/wiki/Oliver_Heaviside
In 1887, Heaviside worked with his brother Arthur on a paper entitled "The Bridge System of Telephony". However the paper was blocked by Arthur's superior, William Henry Preece of the Post Office, because part of the proposal was that loading coils (inductors) should be added to telephone and telegraph lines to increase their self-induction and correct the distortion which they suffered. Preece had recently declared self-inductance to be the great enemy of clear transmission.

anorlunda
Staff Emeritus
That sounds like shunt inductors, not series inductors. The Edison bio about light bulbs and power companies also showed evidence of a nearly universal lack of understanding of the difference between series and parallel, series and shunt.

artis
@fog37 Let me try to jump in a bit here.

First of all, from what I know the typical electrical signal propagation velocity for a copper cable is about 2/3 of c, roughly 0.7c, but as others already said it's not about the speed of light (EM) within the conductor but about the frequency.

Actually, if you think about it, all EM waves travel at c in vacuum, even the ELF ones used for submarine communications that are in the tens to hundreds of Hz range (audible bass frequency range).
As you probably already know, they have a very small bandwidth, but that's not because of their propagation velocity; it's simply because of the low frequency.

So here is a rule of thumb:
1) Propagation velocity - latency (time it takes for your message/signal to arrive, somewhat like the speed of a post pigeon)
2) Bandwidth - amount of information one can send in a fixed time. (sort of like how many pigeons you can make fly at the same time)

The reason I think higher frequency allows for more bandwidth is simple: frequency is just the number of times the EM wave makes a full cycle within a given time period, usually taken as one second.
So the more times your sine wave goes "up and down" within that one second, the more cycles you have for encoding information.

A simple analogy would be a light house trying to send info with Morse code.
It can send the code using at most 1 cycle per second, that is, switching on/off no faster than once a second; adding the delays needed for the code to be meaningful, that means a short bit of code within 10 seconds. Or it could use, say, a laser and switch it on/off 1 billion times a second; again, add the delays, and now you can send a whole book within those 10 seconds.

The rule is that the higher in frequency you go, the more times you can switch on/off within a second, and therefore the more information you can encode within that time.
So IIRC this is why, say, a 5 MHz bandwidth at optical (visible range) frequencies can hold more info than the same 5 MHz band at microwave frequencies, if an identical modulation method is compared.

This is the reason I think why they went from 4G to 5G, more bandwidth in a smaller frequency range, more channels available etc.

fog37
artis
So, given a signal ##V(t)##: sequence of square pulses over a time interval of ##1s##. The more the pulses, the narrower the pulses must be and the larger the signal's bandwidth (Fourier theory).
From what I know, at high frequency "square" waveforms don't exist. If we talk about electrical signals, they involve current, and current cannot change instantly; it takes time for it to change. For a signal, that time is the "rise time" and "fall time", and everything in between, that you see on a scope.
At RF frequencies the current change is so fast that any small inductance will slow it down considerably, and all circuits have some inductance; therefore the rise and fall times resemble a sine wave, not a square one.

I do not know about light and lasers. I would think a laser pulse can have a faster rise time than an electrical impulse traveling down a coax, but I'm not sure about that. In the end the laser (whether a diode laser or otherwise) is also pumped and controlled by an electrical circuit, so I think the rise time of the light output is tied in some proportion to the speed of the electric circuit controlling the laser. But I hope others more knowledgeable about that can comment?

Gold Member
I do not know about light and lasers. I would think a laser pulse can have a faster rise time than an electrical impulse traveling down a coax, but I'm not sure about that. In the end the laser (whether a diode laser or otherwise) is also pumped and controlled by an electrical circuit, so I think the rise time of the light output is tied in some proportion to the speed of the electric circuit controlling the laser
I'm not a laser expert either, but I've worked with many. Yes, lasers can be faster, with pulse widths measured in femtoseconds. But those aren't modulated at those rates; that's more like an oscillator. And you can't easily send them a long way on a transmission line (fiber). Yes, the electronics is often the limiting item. As far as transmission down fibers goes, dispersion is a bigger issue than with "radio" signals, both from the behavior of the material and from the path variations as the light is guided down the fiber. Nobody does real square waves if they want maximum speed. It's all about rise/fall times, jitter, and eye diagrams, with light or electronic transmission. The information is really contained in the rise/fall; if the signal is at a high or low level, there's not much point staying there any longer than the receiver needs.

berkeman
Mentor
Yeah, I spend a lot of time measuring eye diagrams (oscilloscope accumulation mode) in my transceiver and channel design work.

https://www.edn.com/eye-diagrams-the-tool-for-serial-data-analysis/

Oldman too
artis
Yes lasers can be faster, with pulse widths measured in femtoseconds. But, those aren't modulated at those rates, that's more like an oscillator.
I suppose they acquire those pulses as explained in the video below; please see from 7:44 onwards.

A bit later he mentions this method as "phase aka mode locking"; is this what you meant? I suppose so.
PS. I just looked up dispersion in fiber, really interesting phenomenon.

Gold Member
I suppose they acquire those pulses as explained in the video below; please see from 7:44 onwards.

A bit later he mentions this method as "phase aka mode locking"; is this what you meant? I suppose so.
PS. I just looked up dispersion in fiber, really interesting phenomenon.

Yes, that's a good video, but "fundamental" as the title says. That will give you picosecond pulses. There's more required to get to femtoseconds. It gets extremely complex to reach the state of the art, as you can imagine. The materials used can be quite exotic, with extreme requirements.

If you want to learn a bit about the next step, look into "chirped pulse amplification", which uses and/or compensates for dispersion effects to narrow pulse widths. This was worth a Nobel prize in 2018 (for work done in the 1980s).

I believe the current record is 43 attoseconds (##43\times 10^{-18}## s). The fastest thing mankind has ever done.

artis and berkeman
fog37
@fog37 Let me try to jump in a bit here.

First of all, from what I know the typical electrical signal propagation velocity for a copper cable is about 2/3 of c, roughly 0.7c, but as others already said it's not about the speed of light (EM) within the conductor but about the frequency.

Actually, if you think about it, all EM waves travel at c in vacuum, even the ELF ones used for submarine communications that are in the tens to hundreds of Hz range (audible bass frequency range).
As you probably already know, they have a very small bandwidth, but that's not because of their propagation velocity; it's simply because of the low frequency.

So here would be a rule of thumb
1) Propagation velocity - latency (time it takes for your message/signal to arrive, somewhat like the speed of a post pigeon)
2) Bandwidth - amount of information one can send in a fixed time. (sort of like how many pigeons you can make fly at the same time)

The reason I think higher frequency allows for more bandwidth is simple: frequency is just the number of times the EM wave makes a full cycle within a given time period, usually taken as one second.
So the more times your sine wave goes "up and down" within that one second, the more cycles you have for encoding information.

A simple analogy would be a light house trying to send info with Morse code.
It can send the code using at most 1 cycle per second, that is, switching on/off no faster than once a second; adding the delays needed for the code to be meaningful, that means a short bit of code within 10 seconds. Or it could use, say, a laser and switch it on/off 1 billion times a second; again, add the delays, and now you can send a whole book within those 10 seconds.

The rule is that the higher in frequency you go, the more times you can switch on/off within a second, and therefore the more information you can encode within that time.
So IIRC this is why, say, a 5 MHz bandwidth at optical (visible range) frequencies can hold more info than the same 5 MHz band at microwave frequencies, if an identical modulation method is compared.

This is the reason I think why they went from 4G to 5G, more bandwidth in a smaller frequency range, more channels available etc.
Thanks!

Conceptually, let's consider a free space signal, i.e. an electromagnetic field with bandwidth ##B## centered at the origin ##f=0 Hz##. We could:

a) directly transmit the signal as is through the air (no modulation at all)
b) modulate a carrier signal with frequency ##f_{carrier}## to generate a signal with bandwidth ##B## centered at ##f_{carrier}##

We always choose option b), but option a), I believe, is not unfeasible. In general ##f_{carrier}>>B##, but we could also have ##f_{carrier}<B##. The problem with ##f_{carrier}<B## is that if multiple modulated signals were transmitted through the air, it would become technically impossible to separate them at the receiving end: in the frequency spectrum, the bandwidth of one carrier would overlap with the bandwidth of another carrier. So I think that the requirement ##f_{carrier}>>B## ensures that the different carriers can carry signals with bandwidth ##B## without overlap in the frequency domain and without causing issues at the detection end.
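A small sketch of option b) (double-sideband modulation with example numbers; real systems use more elaborate schemes): multiplying a baseband signal by a carrier centers its spectrum at ##f_{carrier}##.

```python
# DSB sketch: multiplying a 50 Hz baseband tone by a 2 kHz carrier puts
# all the energy at f_carrier +/- 50 Hz. All numbers are examples.
import numpy as np

fs = 10000.0
t = np.arange(0, 1, 1 / fs)
baseband = np.cos(2 * np.pi * 50 * t)            # the 50 Hz "data" signal
carrier_f = 2000.0
modulated = baseband * np.cos(2 * np.pi * carrier_f * t)

freqs = np.fft.rfftfreq(len(t), 1 / fs)
spectrum = np.abs(np.fft.rfft(modulated))
peaks = freqs[np.argsort(spectrum)[-2:]]         # two strongest lines
print(sorted(peaks.tolist()))                    # [1950.0, 2050.0]
```

A second carrier at, say, 3 kHz could carry its own ##\pm 50## Hz sidebands with no overlap, which is the frequency-division idea behind having many carriers share one channel.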

A higher frequency carrier can in principle be modulated to carry an information signal with a larger bandwidth. But I believe that, in practice, a GHz carrier generally carries a signal with the same bandwidth as a THz carrier. However, in the GHz region of the spectrum we can transmit fewer carriers while keeping their bandwidths from overlapping. The larger the ratio ##\frac {f_{carrier}}{B}##, the better.

So, yes, a higher carrier allows a higher modulation rate, hence the ability to carry an information signal with more data. But the ability to send more carriers without interference seems to be even more important...

Am I going somewhere with this or completely off? I have to do some reading/studying (never enough, I know).

artis
Thanks!

Conceptually, let's consider a free space signal, i.e. an electromagnetic field with bandwidth ##B## centered at the origin ##f=0 Hz##. We could:

a) directly transmit the signal as is through the air (no modulation at all)
b) modulate a carrier signal with frequency ##f_{carrier}## to generate a signal with bandwidth ##B## centered at ##f_{carrier}##

We always choose option b), but option a), I believe, is not unfeasible. In general ##f_{carrier}>>B##, but we could also have ##f_{carrier}<B##. The problem with ##f_{carrier}<B## is that if multiple modulated signals were transmitted through the air, it would become technically impossible to separate them at the receiving end: in the frequency spectrum, the bandwidth of one carrier would overlap with the bandwidth of another carrier. So I think that the requirement ##f_{carrier}>>B## ensures that the different carriers can carry signals with bandwidth ##B## without overlap in the frequency domain and without causing issues at the detection end.

A higher frequency carrier can in principle be modulated to carry an information signal with a larger bandwidth. But I believe that, in practice, a GHz carrier generally carries a signal with the same bandwidth as a THz carrier. However, in the GHz region of the spectrum we can transmit fewer carriers while keeping their bandwidths from overlapping. The larger the ratio ##\frac {f_{carrier}}{B}##, the better.

So, yes, a higher carrier allows a higher modulation rate, hence the ability to carry an information signal with more data. But the ability to send more carriers without interference seems to be even more important...

Am I going somewhere with this or completely off? I have to do some reading/studying (never enough, I know).
You're pretty much doing OK there.
Just a couple of notes. The reason we use a high frequency carrier and a low frequency data signal is not just because high frequency allows smaller antennas and lower power transmitters, but also because most of the signals we interact with, like audio, are of low frequency. In theory you could transmit audio as is, but you would need a large antenna and big power.
Essentially you would have a 50 Hz power line; power lines also radiate radio waves, by the way, at 50 Hz or 60 Hz.

Now, I might be mistaken here, but I think one of the reasons, at least back in the day, why we used a high frequency only for the carrier and a lower frequency for the actual data is that it is not easy to make practical high power electronics that work at GHz frequencies.
Imagine you have a radar receiver, like those in cars that detect police radars: you receive a tiny signal with low strength, and you need to amplify it and later signal-process it. It is easier to amplify and process lower frequency signals than higher frequency ones.

As for the increase in frequency, I think it is only part of the "game"; the other part is the switch from analog transmission to digital, where the modulation becomes more efficient and you need less bandwidth for the same amount of data. It is the combination of this "space reduction" with increasing frequency that really gives us the ability to stream endless info over the air.
If 5G phone networks switched from digital modulation schemes back to analog, I doubt you could get the download/upload speeds one gets now.

fog37