# Coaxial cable power rating

I am trying to calculate the maximum power rating of a coaxial cable, based on heat transfer parameters and the limiting temperatures of the cable components. I've determined that a certain heat load on the conductor surface (in watts per inch of cable) will raise the dielectric above its operating temperature.

I am having trouble turning that heat load number into a power rating (I'm looking at attenuation numbers). Would a power rating be dependent on cable length? My calculations suggest that length is a factor, but every data sheet I look at disagrees.

sophiecentaur
Gold Member
2020 Award
Hi
You would probably find what you need in the detailed spec of the cable. The power rating of a cable will ultimately depend upon the actual operating frequency (the loss is frequency dependent), the thermal environment of the cable (clearance between cables and ambient temperature) and even the VSWR on the cable. But the manufacturer will have a conservative spec, which would be the one to use unless there is a really good excuse - e.g. cost or available duct space.
I couldn't be sure how W/inch would help (except for a very short piece of cable), as the thermal power dissipated externally AND the RF power dissipated internally would both be proportional to length, so the equilibrium temperature would be the same. Needless to say, it is the start of the cable run where the most power is dissipated: once you are 3 dB down, the power dissipated per unit length has halved.
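To put rough numbers on that, here's a quick sketch. The input power and attenuation figures are made up for illustration, not from any data sheet:

```python
# Illustrative (assumed) figures, not from a real cable spec:
P_in_W = 100.0              # RF power fed into the cable, watts
atten_db_per_100ft = 4.0    # matched-line attenuation at the operating frequency
alpha_db_per_in = atten_db_per_100ft / (100 * 12)   # dB per inch

def power_at(x_in):
    """Forward RF power (W) remaining x_in inches down a matched line."""
    return P_in_W * 10 ** (-alpha_db_per_in * x_in / 10)

def heat_per_inch(x_in):
    """Heat dissipated (W) in the one-inch slice starting at x_in."""
    return power_at(x_in) - power_at(x_in + 1)

print(heat_per_inch(0))     # the hottest slice is the input end

# At the point where the line is 3 dB down, the per-inch
# dissipation has roughly halved:
x_3db = 3 / alpha_db_per_in
print(heat_per_inch(x_3db) / heat_per_inch(0))   # ~0.5
```

The key point is that `heat_per_inch(0)` depends only on the input power and the attenuation constant, not on how long the cable is.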

Interesting. I had been assuming that the power would be dissipated evenly over the length of the cable. What you said makes sense now that I understand attenuation more. So the front end (power-in end) will dissipate more power.

I'm still hung up on the effect of length. Thermal resistance and externally dissipated power are linearly dependent on length. Attenuation (in dB) is also linearly dependent on length, but it is a logarithmic ratio, so it seems to me that the length dependence doesn't wash out. I can post my math if that isn't clear.

Every cable data sheet I can find lists a power rating that is independent of length, so either I'm doing it wrong or they've made an assumption or two.

sophiecentaur
Gold Member
2020 Award
Think of it like this. The EM power decreases exponentially with distance along the cable. This means that the power loss per unit length also decreases exponentially, so the heat generated per unit distance falls off at the same rate. The heat transferred away is proportional to the temperature difference, and the temperature difference decreases as you go further along the line. The rating is set by the hottest point, the input end, and the conditions there don't depend on the total length (i.e. the length dependency will "wash out").
This is the same situation as with radioactive decay and capacitor discharge: one variable is the differential of the other and, with exponentials, the derivative is proportional to the function, so the rate of change is proportional to the value.
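A numerical check of that analogy (the attenuation constant and input power are assumed illustration values): on a matched line the forward power is P(x) = P_in·e^(−2αx), so the heat generated per unit length, −dP/dx = 2αP(x), is always the same fixed fraction of the local power:

```python
import math

# Assumed illustration values, not from a data sheet:
alpha_np_per_in = 0.0004    # attenuation constant, nepers per inch
P_in = 100.0                # input power, watts

def P(x):
    """Forward power (W) at x inches along a matched line."""
    return P_in * math.exp(-2 * alpha_np_per_in * x)

for x in (0.0, 500.0, 1000.0):
    heat_density = 2 * alpha_np_per_in * P(x)   # W per inch at position x
    # The ratio heat_density / P(x) is the constant 2*alpha everywhere,
    # just as the decay rate of a radioactive sample or an RC discharge
    # is proportional to the remaining amount.
    print(x, P(x), heat_density / P(x))
```

At the input end the heat density is 2·α·P_in, which is why the hottest point, and hence the rating, doesn't depend on total cable length.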

You could prove it to yourself numerically by drawing the cable out as a series of potential dividers, each one passing on the same fraction as the previous one. You always have a 49Ω source, a 49Ω output load and a 1Ω series resistor 'at the top end' of the potential divider, and the 49Ω load becomes the source for the next potential divider, and so on. Crude, but I think it would work.
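Here's a quick numerical version of that ladder (the starting voltage and section count are arbitrary):

```python
# A chain of identical potential dividers: each section is a 1 ohm
# series resistor feeding a 49 ohm load, and that 49 ohm load acts
# as the source for the next section.
R_SERIES = 1.0
R_LOAD = 49.0

def ladder_heats(v_start, sections):
    """Heat dissipated in each 1-ohm element down the ladder."""
    heats = []
    v = v_start
    for _ in range(sections):
        i = v / (R_SERIES + R_LOAD)   # current through this section
        heats.append(i ** 2 * R_SERIES)
        v = i * R_LOAD                # voltage handed to the next section
    return heats

heats = ladder_heats(10.0, 20)        # arbitrary 10 V start, 20 sections
ratios = [b / a for a, b in zip(heats, heats[1:])]
print(ratios[0])   # every section dissipates (49/50)**2 of the previous one
```

Each section's dissipation is a constant fraction of the previous one's, which is exactly the discrete version of the exponential fall-off above.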
Plus, 50 thousand data sheets can't be wrong. haha

Are you free to revisit this? I sent you a private message.