# How far away can 1 kW of RF energy be detected?

#### philip porhammer

Summary: If you put a parabolic antenna in space, how far away could a signal be detected?

If the size of the antenna and the frequency were not limited, how far away could a 1 kW RF signal be detected?
Would it be under the −120 dBm noise floor? In space, does the noise floor go down?
This assumes the other end also has a huge space antenna.


#### Delta2

Homework Helper
Gold Member
I see that Voyager 2 has a maximum power of 0.5 kW (I'm not sure if the telecommunications subsystem has a separate power unit) and NASA is still able to communicate with it at a distance of about 120 AU = 18 billion kilometers.

#### philip porhammer

I'm thinking in light years... like the RF signals SETI is looking for.

#### russ_watters

Mentor
I'm thinking in light years... like the RF signals SETI is looking for.
Can you use the information you were just given to make some calculations...?

There are also resources on SETI itself that you can find with a Google search.

#### sophiecentaur

Gold Member
Summary: If you put a parabolic antenna in space, how far away could a signal be detected?

In space does the noise floor go down?
It is a multi-faceted engineering problem. Noise is highly relevant, and the "−120 dBm noise floor" quoted will refer to a certain system bandwidth. Go to 1/10 of that bandwidth and the noise floor will go down by 10 dB. Whether or not the narrower bandwidth is of any use is another question.
The noise arises largely in the first stage of amplification, but interference from the Sun (and from on-board sources) can be significant if it's near the boresight. Terrestrial reception can also be affected by thermal noise from the warm atmosphere and by atmospheric signal absorption (rain!). Even the best and biggest reflecting dishes on Earth are subject to thermal noise currents, due to the hot ground 'creeping' round the edges of the reflector and contributing to receiver noise.
As far as I can find, deployable space-borne dishes are at present no bigger than perhaps 8 m in diameter, which is nothing like as big as the largest terrestrial dishes. The 1 kW source would need as large a reflector as possible - just the same as the receiving end.

1 kW would be a very high-power system to operate a long way from the Sun, because solar PV would not work and you'd be stuck (as all probes are) with realistic levels of available power.
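The bandwidth/noise-floor relation above can be checked with the standard kTB thermal noise formula. This is a minimal sketch assuming a room-temperature (290 K) matched system; the temperatures and bandwidths are illustrative choices, not figures from the thread:

```python
import math

def noise_floor_dbm(temp_k, bandwidth_hz):
    """Thermal (kTB) noise power in dBm available from a matched source."""
    k = 1.380649e-23  # Boltzmann constant, J/K
    p_watts = k * temp_k * bandwidth_hz
    return 10 * math.log10(p_watts / 1e-3)

# At 290 K, kTB is about -174 dBm in a 1 Hz bandwidth.
print(noise_floor_dbm(290, 1))     # ~ -174.0
print(noise_floor_dbm(290, 1e6))   # ~ -114.0 in a 1 MHz bandwidth
print(noise_floor_dbm(290, 1e5))   # ~ -124.0: 1/10 the bandwidth -> 10 dB lower
```

The −120 dBm figure in the original question corresponds to a particular bandwidth near the megahertz range at room temperature; change either the bandwidth or the system temperature and the floor moves.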

#### RPinPA

Homework Helper
Noise is thermal in origin, so a spacecraft in deep space operating near absolute zero has a lower noise floor than one at ambient temperature on Earth. If the temperature is, say, about 270 K less than on Earth, that gives you another 24 dB lower noise floor.

Thermal noise is a certain amount of energy per Hz, so as @sophiecentaur says, less bandwidth gives you less noise.

Finally, you can integrate over time, adding up the signal over many, many repetitions. This can buy you more SNR through what's called "integration gain". In theory there's no limit if you're willing to do enough integration though in practice you probably want to send more than a bit per month or so.

Let's just arbitrarily say you get another 40 dB of SNR from these techniques. Since signal falls as $1/r^2$, that means that $r$ can increase by 20 dB, i.e. that you can go 100 times farther.
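The last step (extra SNR buys the square root of the corresponding power ratio in range, because of the inverse-square law) can be checked directly; a tiny sketch:

```python
import math

def range_multiplier(snr_gain_db):
    """Received power falls as 1/r^2, so a given SNR gain (a power ratio)
    lets the range grow by the square root of that ratio."""
    return math.sqrt(10 ** (snr_gain_db / 10))

print(range_multiplier(40))  # 100.0: 40 dB of extra SNR -> 100x farther
print(range_multiplier(20))  # 10.0
```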

#### sophiecentaur

Gold Member
Finally, you can integrate over time, adding up the signal over many, many repetitions.
That's equivalent to reducing the bandwidth for a given method of coding.

"that means that $r$ can increase by 20 dB, i.e. that you can go 100 times farther."
The inverse square law is great value in that respect! So much better than losses that are proportional to distance - as with cables.

#### RPinPA

Homework Helper
That's equivalent to reducing the bandwidth for a given method of coding.

"that means that $r$ can increase by 20 dB, i.e. that you can go 100 times farther."
The inverse square law is great value in that respect! So much better than losses that are proportional to distance - as with cables.
You're right about bandwidth, I hadn't thought of it that way, but long integration = low data rate = low bandwidth of course.

To the OP, @sophiecentaur suggested at #5 that you look up "link budget". That's what you're asking about and that will tell you the considerations that go into figuring out how to make a communication system achieve a given range.

Directional antennas show up as a gain factor in link budget calculations, as they give you gain over non-directional antennas. Antenna gain is related to the ratio of aperture to wavelength. You aren't going to get much gain with an antenna only a few meters across at radio frequencies. But perhaps you could synthesize a larger aperture by combining several widely spaced antennas (this is done with radio telescopes on Earth), or go to shorter wavelengths, such as light. If this is a sci-fi concept, you could hypothesize a civilization that managed to use X-rays or gamma rays for communication, for even more gain.
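The aperture-to-wavelength point can be made concrete with the standard parabolic-dish gain formula $G = \eta (\pi D/\lambda)^2$. The diameters, frequencies, and the 0.6 efficiency below are illustrative assumptions, not figures from this thread:

```python
import math

def dish_gain_dbi(diameter_m, freq_hz, efficiency=0.6):
    """Approximate boresight gain of a parabolic dish: G = eta*(pi*D/lambda)^2."""
    c = 299_792_458.0
    lam = c / freq_hz
    g = efficiency * (math.pi * diameter_m / lam) ** 2
    return 10 * math.log10(g)

print(dish_gain_dbi(3.7, 8.4e9))   # ~48 dBi: a few-metre dish at X band
print(dish_gain_dbi(305, 5e9))     # ~82 dBi: an Arecibo-sized dish at 5 GHz
print(dish_gain_dbi(3.7, 100e6))   # ~10 dBi: same small dish at 100 MHz
```

The last case illustrates the point in the post: at long radio wavelengths, a dish only a few metres across buys very little gain.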

#### sophiecentaur

Gold Member
But perhaps you could synthesize a larger aperture by combining several widely spaced antennas
That's a great technique for improving the resolution of sources (interferometry), but there is no substitute for brute force and area for gathering power to improve SNR.

Just to get things in proportion here: the Voyager 1 probe, which has just gone 'interstellar' (19 billion km away), uses a 22.5 W power source and some very aged equipment. We are still getting a trickle of useful data back.

#### stefan r

Gold Member
I'm thinking in Light years....like what SETI is looking for RF signals
The range depends on the nature of the source. A laser shows up from much further away than a porch light with the same power supply, and a focused laser from further away still.

"a 2-megawatt laser pointed through a 30-meter telescope could create a strong enough signal to reach Proxima Centauri B. A laser with half that power—only 1 megawatt—if directed through a 45 meter telescope, would be visible to alien astronomers in the TRAPPIST-1 system."
The signal just needs to be brighter than the Sun in the infra-red frequency. That makes it visible anywhere that the Sun is visible. The target star/planet would need to have a 1 meter telescope focused on the Sun.

Radio frequency is generally lower energy than infra-red.

how far could a 1KW RFsignal be detected?
...
assuming also that the other end also has a huge space antenna....
"Huge" is not a scientifically precise unit of measure. You could, for example, take a cubic kilometer of asteroid iron and roll it into 1 mm-thick dish material to get a $10^{12}$ m² surface. Asteroids like Vesta are much larger. There is no known reason why a planet like Mercury could not be made into a satellite dish or a swarm of dishes. Giant molecular clouds hold many solar masses of metals that could be made into antennas, and they are already conveniently spread out over several parsecs.

This paper says on page 207 that 500-light-year, 5 GHz, Arecibo-dish-to-Arecibo-dish communication can be done with 4,623 W.
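That figure can be sanity-checked with a back-of-envelope link budget. Everything below (305 m dishes, 50% aperture efficiency, a 10 K system temperature, a 1 Hz bandwidth) is an assumption chosen for illustration, not taken from the paper:

```python
import math

C = 299_792_458.0   # speed of light, m/s
K = 1.380649e-23    # Boltzmann constant, J/K
LY = 9.4607e15      # metres per light year

def db(x):
    return 10 * math.log10(x)

# Assumed parameters (illustrative only)
freq = 5e9
lam = C / freq
d = 500 * LY
dish = 305.0        # Arecibo-class diameter, m
eta = 0.5           # aperture efficiency (assumed)
gain = db(eta * (math.pi * dish / lam) ** 2)  # ~81 dBi at each end

tx_dbw = db(4623)                       # transmit power, ~36.6 dBW
fspl = 2 * db(4 * math.pi * d / lam)    # free-space path loss, ~420 dB
rx_dbw = tx_dbw + 2 * gain - fspl       # received power, dBW

noise_dbw = db(K * 10 * 1)              # kTB: T_sys = 10 K, B = 1 Hz (assumed)
print(rx_dbw, noise_dbw, rx_dbw - noise_dbw)
```

With these assumed numbers the received power lands within a few dB of the 1 Hz thermal noise floor, so the quoted power is at least the right order of magnitude.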

#### sophiecentaur

Gold Member
This paper says on page 207 that 500-light-year, 5 GHz, Arecibo-dish-to-Arecibo-dish communication can be done with 4,623 W.
It may not be quite so straightforward when the proper motions of Tx and Rx are involved and the beam widths of the antennae are narrow. For a 300 m dish at 5 GHz, the beam width is well under 1 degree, and at higher (optical) frequencies pointing could become a serious problem. The antenna-gain advantage has to run out at some point.
Maybe 500 years of relative motion could be dealt with, though.
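For reference, the usual rule of thumb for a dish's half-power beamwidth is $\theta \approx 70\lambda/D$ degrees. A quick check of the 300 m / 5 GHz case (the factor of 70 is the common approximation, assumed here):

```python
def hpbw_deg(diameter_m, freq_hz, k_factor=70.0):
    """Half-power beamwidth rule of thumb: theta ~ k * lambda / D, in degrees."""
    lam = 299_792_458.0 / freq_hz
    return k_factor * lam / diameter_m

print(hpbw_deg(300, 5e9))   # ~0.014 degrees: far narrower than 1 degree
```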

#### Paul Colby

Gold Member
From the statement of the question, isn't detection simply a matter of power level above thermal background? Transmission of information is a much more involved question than just radiometry. Kraus's book on antennas has a very good discussion of this.

In general, EM energy originating from our planet will compete with the output of our star over the bandwidth being broadcast. Transmitters such as radios or lasers pump much more energy into a given bandwidth than the Sun does; they effectively have extremely high temperatures over a small bandwidth. All power drops as $1/R^2$, so the ratio of effective temperatures between the transmitter and the Sun stays the same with distance. SETI searches based on lasers look for coincidences between received photons in a small band: 4 or 5 photons of one color detected within some small time window indicate a non-natural source, or so the argument goes.
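The coincidence argument can be quantified with Poisson statistics. The background rate below (0.01 expected photons per coincidence window) is an arbitrary assumed number for illustration:

```python
import math

def prob_at_least(k, mean):
    """P(N >= k) for a Poisson-distributed photon count with the given mean."""
    p_less = sum(math.exp(-mean) * mean**i / math.factorial(i) for i in range(k))
    return 1 - p_less

# Assumed background: 0.01 photons expected in the coincidence window.
print(prob_at_least(1, 0.01))  # ~0.01: one photon alone is unremarkable
print(prob_at_least(4, 0.01))  # ~4e-10: four together is a very strong flag
```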

#### stefan r

Gold Member
From the statement of the question, isn't detection simply a matter of power level above thermal background? Transmission of information is a much more involved question than just radiometry. Kraus's book on antennas has a very good discussion of this.

In general, EM energy originating from our planet will compete with the output of our star over the bandwidth being broadcast. Transmitters such as radios or lasers pump much more energy into a given bandwidth than the Sun does; they effectively have extremely high temperatures over a small bandwidth. All power drops as $1/R^2$, so the ratio of effective temperatures between the transmitter and the Sun stays the same with distance. SETI searches based on lasers look for coincidences between received photons in a small band: 4 or 5 photons of one color detected within some small time window indicate a non-natural source, or so the argument goes.

That sounds basically like the strategy suggested by James R. Clark with the megawatt laser and a 40 m telescope. He thought you only need 0.1% of the Sun's brightness in the infra-red near 1315 nm. He used the Air Force's airborne laser project as a model for the source; 1/1000 of the Sun's brightness should be enough to achieve the required signal-to-noise ratio.

So how narrow does the bandwidth have to be for 1 kW to be brighter than the Sun? Is it possible to produce and detect an infinitely narrow band? That would make communication a lot easier. Can we do narrow-band searches?

#### sophiecentaur

Gold Member
Is it possible to produce an infinitely narrow band and detect an infinitely narrow band? That would make communication a lot easier. Can we do narrow band searches?
Unfortunately, you need finite bandwidth to transfer a finite amount of information. You have to decide what information rate you want first, and that more or less determines the bandwidth you need. That in turn determines the noise from the star your transmitter is near, which determines the transmitter power you need for "communication".
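That chain (rate determines bandwidth, bandwidth determines noise, noise determines power) is the Shannon capacity relation $C = B\log_2(1 + P/kT_{\mathrm{sys}}B)$ rearranged for power. A minimal sketch, with an assumed 10 K system temperature:

```python
K = 1.380649e-23  # Boltzmann constant, J/K

def min_rx_power_w(bit_rate, bandwidth_hz, t_sys_k):
    """Received power needed to hit the Shannon limit
    C = B * log2(1 + P / (k * T * B)), solved for P."""
    n0 = K * t_sys_k
    snr = 2 ** (bit_rate / bandwidth_hz) - 1
    return n0 * bandwidth_hz * snr

# Assumed numbers: 1 bit/s in a 1 Hz channel vs. 1 Mbit/s in a 1 MHz channel.
print(min_rx_power_w(1, 1, 10))      # ~1.4e-22 W
print(min_rx_power_w(1e6, 1e6, 10))  # a million times more power for 10^6x rate
```

At fixed spectral efficiency (here 1 bit/s/Hz), required power scales linearly with the data rate, which is why a "trickle" of data is so much cheaper than a torrent.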

There are practical considerations too. The transmitted information rate needs to be high enough for a random being to 'notice' that something non-random is going on. The two civilisations would need to be at a similar stage of development, so that one could receive and recognise that a message had been sent. To establish a conversation, the lifetimes of the two civilisations would need to overlap by several times the light travel time between them. It's another couple of parameters to bolt into an updated version of the Drake Equation.

The window for two planets both to host some form of life simultaneously could be billions of years, but the coincidence window for two civilisations to be in their 'communication ages' might be a matter of only perhaps a thousand years.

Hopefully someone will, before too long, discover some primitive wee beasties under the surface of Mars and that will answer the question as to whether Earth is the only cradle of 'life'. Then, hopefully, things will settle down a bit and the question of saying hello to chattering aliens will be put into abeyance for a while.

#### f95toli

Gold Member
Noise is thermal in origin, so a spacecraft in deep space operating near absolute zero has a lower noise floor than one at ambient temperature on earth. If the temperature is about, say 270 K less than on earth, that gives you another 24 dB lower noise floor.
No, it is not that easy. There is no simple relationship between the noise temperature of an RF system and the ambient temperature (unless that system is a 50 ohm resistor). A microwave amplifier with a noise temperature of 300 K would be extremely bad; a normal low-noise amplifier with a 1 GHz bandwidth will typically have a noise temperature of about 40 K or so.
It is true that you can get better performance from cryogenic amplifiers, with noise temperatures of 2.5-4 K, but you typically only need to cool the amplifier to about 20 K for that to happen (below that temperature you are limited by the self-heating of the HEMT).

The signal from Voyager is actually still(?) amplified using a MASER; and the effective noise temperature of that is very low.

#### sophiecentaur

Gold Member
There is no simple relationship between the noise temperature of an RF system and the ambient temperature (unless that system is a 50 ohm resistor).
The (50 Ω) value of the resistor is not fundamentally relevant. Every resistor has an available noise power of kTB. The secret is to match the signal power as well as possible into the amplifier and to provide enough front-end gain that noise from the subsequent stages is diluted. Noise figure represents the increase over kTB noise that the amplifier introduces.

Edit: but that's not the whole story, as someone is bound to point out.
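The "dilution" of later-stage noise by front-end gain is the Friis cascade formula, $T_{\mathrm{sys}} = T_1 + T_2/G_1 + T_3/(G_1 G_2) + \dots$. A minimal sketch with assumed, illustrative stage figures:

```python
def cascade_noise_temp(stages):
    """Friis cascade: T_sys = T1 + T2/G1 + T3/(G1*G2) + ...
    stages is a list of (noise_temp_K, linear_gain) tuples, front end first."""
    t_sys, gain = 0.0, 1.0
    for t, g in stages:
        t_sys += t / gain
        gain *= g
    return t_sys

# Assumed chain: a 4 K LNA with 30 dB gain in front of a noisy 300 K second stage.
lna_first = cascade_noise_temp([(4, 1000), (300, 100)])
no_lna = cascade_noise_temp([(300, 100), (4, 1000)])
print(lna_first)  # 4.3 K: front-end gain dilutes the later stage's noise
print(no_lna)     # 300.04 K: stage order matters enormously
```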

#### f95toli

Gold Member
True, I just assumed that we were dealing with a 50 ohm system

#### Paul Colby

Gold Member
No, it is not that easy.
Agreed; however, cooling the electronics is not the only approach, depending on what one is trying to do. For example, I recently experimented with correlating signals between two software-defined radios. With averaging (okay, a lot of averaging) I was able to detect signals (as in harmonic tones) 30 to 40 dB below the noise floor of each radio. The thermal noise in each radio is statistically independent of the other's. If one is just looking for broadcast energy, this is one potential detection scheme.
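The cross-correlation trick can be sketched in a toy simulation: a common tone 30 dB below each receiver's noise power, with independent noise in each channel, accumulates coherently in the averaged cross-spectrum while the noise-only products random-walk toward zero. All the numbers (tone bin, snapshot count, etc.) are arbitrary demo choices; this is a simplified model of the idea, not a reconstruction of the actual SDR experiment:

```python
import cmath, math, random

random.seed(1)
n, snaps = 256, 2000
f_sig, f_ref = 10, 40          # tone bin and a tone-free reference bin (arbitrary)
amp = math.sqrt(2 * 1e-3)      # tone power 1e-3 vs. unit noise power: -30 dB

# Precomputed DFT twiddle factors for the two bins of interest.
tw_sig = [cmath.exp(-2j * math.pi * f_sig * t / n) for t in range(n)]
tw_ref = [cmath.exp(-2j * math.pi * f_ref * t / n) for t in range(n)]

def dft_bin(x, tw):
    """Single DFT bin of a real sequence, given its twiddle factors."""
    return sum(v * w for v, w in zip(x, tw))

cross_sig = cross_ref = 0j
for _ in range(snaps):
    ph = random.uniform(0, 2 * math.pi)   # tone phase wanders between snapshots
    tone = [amp * math.cos(2 * math.pi * f_sig * t / n + ph) for t in range(n)]
    rx1 = [s + random.gauss(0, 1) for s in tone]   # receiver 1: tone + own noise
    rx2 = [s + random.gauss(0, 1) for s in tone]   # receiver 2: independent noise
    cross_sig += dft_bin(rx1, tw_sig) * dft_bin(rx2, tw_sig).conjugate()
    cross_ref += dft_bin(rx1, tw_ref) * dft_bin(rx2, tw_ref).conjugate()

# The common tone adds coherently in the cross-spectrum; the independent-noise
# products have random phase and grow only as sqrt(snaps).
print(abs(cross_sig) / abs(cross_ref))   # noticeably greater than 1
```

The key point is the one made in the post: because the two radios' thermal noise is statistically independent, only the common signal survives long cross-correlation averaging.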

#### sophiecentaur

Gold Member
True, I just assumed that we were dealing with a 50 ohm system
To do it best, you need to match the source resistance (transformed radiation resistance of the antenna) to the inherent impedance of the amplifier - to maximise signal power; the noise power from the front end resistance is 'what it is'.

#### f95toli

Gold Member
To do it best, you need to match the source resistance (transformed radiation resistance of the antenna) to the inherent impedance of the amplifier - to maximise signal power; the noise power from the front end resistance is 'what it is'.

True. But most microwave components have built-in matching networks, meaning they are - from the outside - 50 ohm devices; and AFAIK the same is true for most antennas, or otherwise you wouldn't be able to use 50 ohm coax to connect them to your amplifier. In my "world" it is very rare for anything not to have a 50 ohm input/output impedance.
That said, this is of course not always true. I do have an old custom-made 4-8 GHz cryogenic amplifier (about 4.5 K noise temperature) which was designed to dissipate as little power as possible, meaning it has no resistors or diodes inside the box (the rest of the electronics is in a separate box at room temperature), and it is not even close to 50 ohm; an external 50 ohm isolator is therefore used at the input to suppress reflections.

However, most of our newer amplifiers do have 50 ohm impedance, which makes life easier, and the best ones we have achieve a noise temperature of about 2.5 K, which is pretty much state of the art unless you move to quantum-limited parametric amplifiers (which we also have).