# How is noise removed in radio telescopes?

1. May 20, 2018

### Phys12

For an optical telescope, light pollution is a problem, which is why we put ground-based telescopes in remote places where there isn't much light. However, aren't we bombarded by radio waves from satellites for our phones and TVs? How do radio telescopes remove those?

2. May 20, 2018

### Drakkith

Staff Emeritus
Many radio telescopes are located in regions shielded from ambient radio and microwave radiation. They can be located in valleys, far away from cities, etc. The Green Bank Telescope, for example, is located in the National Radio Quiet Zone.

3. May 20, 2018

### Phys12

I see, so the same principle and solution applies to radio telescopes, thanks!

4. May 20, 2018

### davenn

And just as for optical telescopes, radio telescopes can employ filters to block signals other than the ones they want to receive.

Noise generated by heat in the electronics can be substantially reduced by cryogenically cooling the receiver components, e.g. with liquid nitrogen.

Also, radio telescopes operate on different frequencies from the things you mentioned.

Dave

5. May 20, 2018

### Cathal

Radio-astronomy systems usually operate close to theoretical noise limits. With a few exceptions, signals are extremely weak. One such exception is the Sun: depending on frequency, solar cycle, antenna size, and system noise temperature, pointing an antenna at the Sun normally increases the received power severalfold. Toward other sources, it is not unusual to detect and measure signals that are less than 0.1% of the system noise.

To detect and measure signals that are a very small fraction of the power passing through the receiver, signal averaging, or integration, is used. If the receiver gain were perfectly stable, our ability to measure small changes in signal would be given by the radiometer (noise) equation.

Another powerful technique for extracting weak signals from noise is correlation. The radio telescope in this case has two or more receivers, either connected to the same antenna or, more often, to two or more separate antennas. The signal voltages are multiplied together before averaging, instead of multiplying the signal voltage by itself to obtain the power.
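As a rough numerical sketch of the integration argument above (all numbers invented, not from any real system): a source at 0.1% of the system noise is invisible in a single power sample, but averaging many samples shrinks the noise floor as one over the square root of the sample count.

```python
import numpy as np

rng = np.random.default_rng(42)

t_sys = 1.0      # system noise power, arbitrary units
source = 0.001   # source power: 0.1% of the system noise
n = 10_000_000   # independent power samples to integrate

# Each power sample fluctuates by roughly the total power itself,
# i.e. a single sample has SNR ~ 1 against the system noise alone.
samples = t_sys + source + rng.normal(0.0, t_sys, n)

mean_power = samples.mean()
sigma = t_sys / np.sqrt(n)  # expected noise floor after averaging n samples

detection_snr = (mean_power - t_sys) / sigma
print(f"measured excess power:   {mean_power - t_sys:.2e}")
print(f"noise after integration: {sigma:.2e}")
print(f"detection significance:  {detection_snr:.1f} sigma")
```

With these numbers the 0.1% source should emerge at the few-sigma level; integrating longer (larger n) raises the significance as the square root of n.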

6. May 20, 2018

### Drakkith

Staff Emeritus
As far as I am aware, those transmissions still contribute noise even though the telescope isn't observing at their frequencies. I could be wrong though.

7. May 20, 2018

### davenn

If the scope is close to the source, yes, which is why, as you said in your post, most radio telescopes are built in relatively noise-free areas.
Filtering takes care of most of the rest of the noise.

8. May 21, 2018

### nikkkom

They reduce noise by various clever methods.

A rather simple example: if you are measuring the signal from a compact source (e.g. a distant galaxy), you can have two identical receivers, one getting signal from the source and another from "empty space" nearby (say, ~0.1 degree away). Terrestrial and Solar-System noise sources affect both receivers equally, since the receivers are identical and use the same antenna at the same time. Subtract the two signals, and you get rid of those types of noise.
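A toy simulation of this two-beam subtraction (all values hypothetical): both receivers see the same drifting terrestrial foreground plus their own independent noise, and only the ON beam sees the weak source.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Common-mode interference seen identically by both beams,
# e.g. a slowly varying terrestrial transmitter.
t = np.linspace(0.0, 1.0, n)
foreground = 5.0 + 2.0 * np.sin(2.0 * np.pi * 3.0 * t)

source = 0.05                        # weak compact source, ON beam only
noise_on = rng.normal(0.0, 1.0, n)   # independent receiver noise, ON beam
noise_off = rng.normal(0.0, 1.0, n)  # independent receiver noise, OFF beam

on = foreground + source + noise_on  # beam pointed at the galaxy
off = foreground + noise_off         # beam ~0.1 degree away, "empty space"

# The common foreground cancels exactly; the price is sqrt(2) more
# receiver noise per sample, which averaging then beats down.
diff = on - off
print(f"ON-beam mean (source swamped):  {on.mean():.3f}")
print(f"difference mean (source shows): {diff.mean():.4f}")
```

The difference signal averages to the source power, with the drifting foreground gone entirely.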

9. May 21, 2018

### Drakkith

Staff Emeritus
You bring up an interesting concept. From my astrophotography experience, 'noise' isn't removable. 'Extraneous signal' is removable, but the noise that this unwanted signal brings is not. By noise I mean the inherent random variation in the received signals, wanted or unwanted. I'm not sure how radio astronomy defines noise, so I can't tell you how they handle it.

10. May 21, 2018

### Barakn

You're right. Consider the mysterious signals known as perytons that plagued the Parkes Radio Telescope. They turned out to be caused by microwave-oven users opening the oven door prematurely, before the magnetron had powered down. While microwave ovens have a target frequency, the opening door at first lets out only shorter-wavelength, higher-frequency waves, and then, as the opening gets bigger, longer-wavelength, lower-frequency waves escape. These downward frequency sweeps mimicked Fast Radio Bursts.

11. May 21, 2018

### davenn

Since you touched on the subject:

There are two main sources of noise in modern astrophotography (digital imaging), and both can be quite successfully dealt with.

1) Man-made light pollution: use a dark site and/or filters. People are doing deep-space imaging even during a full moon from suburbia. Stacking multiple long exposures (around 5 minutes each) also substantially reduces random noise.

2) Noise generated by the sensors themselves: cool the sensor, and take dark frames and subtract them from the lights to get rid of hot pixels.
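A toy sketch of that dark-frame step (hypothetical sensor, deliberately exaggerated hot-pixel levels): subtracting a dark frame taken with the shutter closed removes the hot-pixel signal from the light frame.

```python
import numpy as np

rng = np.random.default_rng(5)
shape = (64, 64)

# Hypothetical sensor: three "hot" pixels leak charge strongly
# (values deliberately exaggerated for the demo).
hot = np.zeros(shape)
hot[(5, 20, 40), (10, 33, 50)] = 5000.0

sky = 300.0  # mean sky electrons per pixel per exposure

light = rng.poisson(sky, shape) + rng.poisson(hot)  # normal exposure
dark = rng.poisson(hot)                             # shutter closed: hot pixels only

# Subtracting the dark removes the hot-pixel *signal*; the extra noise
# at those pixels remains (stacking many darks reduces it).
cleaned = light - dark
print("hottest pixel before:", light.max())
print("hottest pixel after: ", cleaned.max())
```

After subtraction the formerly hot pixels fall back near the sky level, though with more scatter than their neighbors.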

Dave

12. May 22, 2018

### nikkkom

Sure. But my example was rather trivial. Very clever people have spent decades thinking about this, and they have invented ever more non-obvious ways to tease signal from the noise. Another example: https://en.wikipedia.org/wiki/Closure_phase

13. May 22, 2018

### Drakkith

Staff Emeritus
Sorry, I should have phrased that better. I meant that once an image is taken, the noise can't be removed (as far as my limited knowledge tells me).

14. May 22, 2018

### Phys12

Wait, I thought that you could do that. That's why you take the flat and bias frames and then subtract them from the image taken to create the master bias and master flats. Is that not true?

15. May 22, 2018

### davenn

Well, that's why you do the things I commented on.

(With analog imaging on film, yes, you are pretty much stuck; digital imaging leaves you wide open for noise removal.)

Taking multiple images and stacking them substantially reduces random noise.

Adding "dark frames" to the stack removes ALL the hot pixels.

And some creative post-processing on that final stacked image results in a very low-noise image.

(Attached image: an example of a very low-noise amateur astro image; 81 minutes of accumulated exposure time. Credit: Franky T Astro Cop, an Australian astronomy mate from up north.)

cheers
Dave

16. May 22, 2018

### Drakkith

Staff Emeritus
Neither of those removes noise; they just remove non-uniform 'signal'. By signal I mean the actual electrons generated by incoming photons and the bias added by the sensor. The noise is the random variation in this electron count that causes a 'grainy' image.

It increases the SNR, but the amount of noise as I defined above actually increases. Luckily the signal increases linearly while the noise typically increases as the square root of the signal, which is why the SNR rises.

It removes the unwanted signal from these pixels, but the noise is not removed.

17. May 22, 2018

### davenn

Your idea of noise in an image and mine seem to be very different; I really can't figure out where you are going with this.

Noise is noise is noise, regardless of how it is generated, and the processes mentioned do a lot to remove those various types of noise.

18. May 22, 2018

### davenn

Yes, it is true.

I don't know if Drakkith thinks that I think all (100%) of the noise can be removed. Of course not, and I am not stating or advocating that.

But the noise can be substantially reduced by the various methods I have stated, including the use of bias frames, which help remove the bias signal generated by the sensor.

From the Cloudy Nights forum:

The sensors coming out these days have such low bias noise that many imagers are not even bothering with bias frames.

That matches the comments for one of my cameras.

Cameras of 5-20 years ago suffered from a lot of self-generated noise. These days it is hardly an issue; even just in the last 5 years the technology has advanced well.

Dave

19. May 22, 2018

### Drakkith

Staff Emeritus
My definition comes from my book on astronomical image processing and is more technical than the way I've usually seen noise described.
Imagine taking two images of the same starfield: identical exposure times, identical filtering, the same camera, the same software, the same location, etc. If you were to inspect every pixel on both images, you would find that they are not identical. The number of electrons counted in each pixel of one image is slightly different from the number in the corresponding pixel of the other (I say electrons and not photons because electrons are what the detector physically counts). If you were to take a third image, you would find that, again, all of the pixels have slightly different values. This variation is noise, and while you cannot predict the exact value a pixel will take between successive images, it follows a statistical pattern, namely that the noise grows as the square root of the signal.

That signal could be the actual photons being captured by the sensor (either from your target or from stray or unwanted light), the electrons generated in the sensor by dark current, or the electrons generated by the onboard electronics. All of these things serve as 'signals' and all contribute their own noise to the resulting image. Basically anything that generates electrons in the detector is a signal.

Subtracting dark frames, flat frames, or bias frames does not subtract the noise added to the image from dark current, from bias, or from the flat, dark, and bias frames taken to do the subtraction. The reason that these are taken and subtracted is to remove the signal from each of these sources, in addition to fixing hot/cold pixels. That leaves you with, ideally, the signal from your target, the signal from the background and ambient light, and the noise from all of the sources.

Taking multiple exposures and adding/averaging them together does the same thing that taking a longer exposure would do. It increases the signal to noise ratio. If we examine the same pixel from images taken of the same object at 5, 10, and 20 second exposures we would find that the pixel value increases approximately linearly, with the 10 second image having twice the signal as the 5 second image and the 20 second image having 4 times the signal. However, the noise does not increase linearly. The noise in the 10 second image is only $\sqrt{2}$ times the noise in the 5 second image, and the noise in the 20 second image is $\sqrt{2}$ times as much as in the 10 second. So increasing the exposure time from 5 seconds to 20 has increased the signal by 4x but the noise by only $\sqrt{2}*\sqrt{2}$, or 2x. Hence the SNR has increased by a factor of 2 also. Stacking images is almost identical except for the fact that the readout noise of the sensor is comparatively higher than it would be if you just increased the exposure time.
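That 5/10/20-second scaling is easy to check with a quick Poisson simulation; the 100 electrons-per-second rate is made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
rate = 100.0        # hypothetical: electrons per second landing on one pixel
n_trials = 200_000  # repeat each exposure many times to measure the scatter

results = {}
for t in (5, 10, 20):
    # Photon arrival is a Poisson process: mean counts scale linearly
    # with exposure time, the scatter only as its square root.
    counts = rng.poisson(rate * t, n_trials)
    results[t] = (counts.mean(), counts.std())
    print(f"{t:2d} s: signal ~ {counts.mean():6.0f} e-, "
          f"noise ~ {counts.std():4.1f} e-, SNR ~ {counts.mean() / counts.std():4.1f}")
```

Going from 5 s to 20 s, the signal quadruples while the noise only doubles, so the SNR doubles, exactly as described above.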

Narrowband filtering 'removes noise' by blocking all of that pesky background light that you don't want, which would only add lots of noise to the image. You could, after all, digitally subtract some quantity from all the pixel values so that this background light wasn't visible, except that its noise can potentially be larger than the signal of your target! That's why shooting in heavy light pollution is so bad: the target's signal is swamped by the inherent noise of the ambient light.

For example, when shooting from inside a city, in a 30-second exposure I might see pixel counts of more than 40,000 electrons per pixel all across my sensor. The noise inherent in this background light is roughly $\sqrt{40,000}$, or ±200 e. Technically that 40,000 is a mean, since the noise is a random variation about some central value: the measured value of a particular pixel over many exposures would have a high chance of falling within 200 e of 40,000, a somewhat lower chance of being a little more than 200 e above or below, an even lower chance of being further out still, and so on. When I compare this to the expected signal of my target, which may be only a hundred electrons per pixel over those 30 seconds, or even less, you can see that the per-pixel variation due to the noise can be much larger than the signal from my target. This is what it means for a signal to be buried in the noise.
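Plugging those numbers in directly (40,000 e- of background and a 100 e- target in the same exposure) shows the target sitting at roughly half the per-pixel noise, and how averaging n independent pixels or exposures digs it out:

```python
import math

background = 40_000  # background electrons per pixel in the 30 s exposure
target = 100         # target electrons per pixel in the same exposure

# Shot noise of everything the pixel collected, background dominated.
noise = math.sqrt(background + target)
print(f"per-pixel noise:         ~{noise:.0f} e-")
print(f"single-pixel target SNR:  {target / noise:.2f}")

# Averaging n independent measurements shrinks the noise by sqrt(n).
for n in (16, 256):
    print(f"after averaging {n:3d} measurements: SNR {target * math.sqrt(n) / noise:.2f}")
```

The single-pixel SNR is about 0.5, i.e. the target is literally smaller than the random fluctuation on its own pixel, and only heavy averaging pulls it above the noise.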

I hope I've made myself a bit clearer now.

20. May 22, 2018

### Fred Wright

Dear Drakkith,
Do people use maximum entropy signal reconstruction for astronomy or is its noise reduction effectiveness considered marginal?

21. May 22, 2018

### Drakkith

Staff Emeritus
No idea. I've never heard of that before. Sorry!

22. May 22, 2018

### glappkaeft

I agree with Drakkith. The term 'noise' is misused in astrophotography about as commonly, widely, and horribly as the kilogram is used to measure weight in everyday speech.

All relevant signals, when it comes to digital sensors in astronomy, are modeled as stochastic processes with a Poisson distribution. This means that the signal is equal to the number of photons/electrons detected by the sensor (this relates to the Nobel Prize Einstein received for the photoelectric effect), and the noise is thus equal to the square root of the signal (in statistical terms the noise is the standard deviation; for a Poisson process the variance equals the mean, so the standard deviation is the square root of the mean).

The three main signals are:
1. Bias/offset/pedestal signal - a small constant signal added by the sensor circuits to every exposure to make sure the final number can't become negative.
2. Dark-current signal - electrons leaking into the "pixels" over time, a process greatly accelerated by higher sensor temperatures. This is why serious astrocameras are both chilled and temperature regulated (to keep the variation down).
3. Light signal - all the actual photons your sensor detects (or, in the case of vignetting, photons it fails to detect but whose loss can be measured/modeled). This includes:
+ The actual object you want to observe
+ Cosmic rays
+ Vignetting, dust bunnies, etc.
+ Zodiacal light
+ Airglow/aurora
+ Atmospheric dispersion of light sources
+ Reflections/dispersion due to (nearby or not) light sources reaching the sensor even though they shouldn't, given the properties of the optics
+ Light pollution
+ Aircraft and satellite trails
+ other things I've forgotten to mention

In serious astrocameras the digital output has "unity" gain, meaning that the ADU (Analog-to-Digital Unit), more commonly known as the pixel value, is equal to the number of electrons detected from all sources. Only the electrons from case 3 are actually due to photons, and ideally you'd want to avoid both the spurious electrons from cases 1 and 2 and many of the photons from case 3 (you are, after all, only interested in the photons from the objects you want to observe).

There are limits to what we can do to avoid detecting the spurious electrons from 1 and 2, but in most relevant cases 1 only matters if your exposures are too short (and the signal, but not the noise, can be removed using bias frames), and 2 can be limited by cooling the sensor; if the temperature is steady, you can easily remove the signal (but again not the noise) by dark-frame subtraction.
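This "signal goes, noise stays" point can be demonstrated with a small simulation (dark-current and sky levels are invented): subtracting a master dark restores the correct mean, but the pixel-to-pixel scatter stays essentially unchanged.

```python
import numpy as np

rng = np.random.default_rng(3)
shape = (512, 512)

dark_rate = 50.0  # invented: mean dark-current electrons per pixel per exposure
sky = 200.0       # invented: mean sky + target electrons per pixel

# A light frame contains both signals, each carrying its own Poisson noise.
light = rng.poisson(sky, shape) + rng.poisson(dark_rate, shape)

# A master dark averaged from many dark frames estimates the dark *signal*.
master_dark = np.mean([rng.poisson(dark_rate, shape) for _ in range(32)], axis=0)

calibrated = light - master_dark

print(f"mean before subtraction: {light.mean():6.1f} e-  (sky + dark signal)")
print(f"mean after subtraction:  {calibrated.mean():6.1f} e-  (dark signal removed)")
print(f"noise before: {light.std():5.2f} e-")
print(f"noise after:  {calibrated.std():5.2f} e-  (dark noise still there)")
```

The subtraction shifts the mean from sky-plus-dark back to sky, but the scatter barely changes (it even ticks up slightly, from the noise in the master dark itself).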

For case 3, some of the signals (and thus their inherent noise) can be reduced by placing the telescope in the right spot (high and dark), taking the image at the right time (no Moon overhead, no aurora), using filters (very narrow-band filters work even from some of the most light-polluted areas; I've seen amazing narrow-band images from amateurs in Rome and Athens), etc. If the signal you want to avoid ever hits the sensor, the best you might be able to do is to measure or model it (flat frames, background subtraction, etc.), but then you are again left with removing just the signal and not the noise.

The technique used to stack sub-exposures (average, median, sigma-kappa clipping, etc.) also has an impact on how the noise and spurious signals are controlled.

Last edited: May 22, 2018
23. May 22, 2018

### davenn

Drakkith and you have said this several times, and I really cannot make sense of what you are driving at.
You are not defining what you mean by the difference between the terms signal and noise.
How can you remove the signal without removing the noise? The noise IS a signal, just an unwanted one.

24. May 22, 2018

### glappkaeft

Again:
All relevant signals, when it comes to digital sensors in astronomy, are modeled as stochastic processes with a Poisson distribution. This means that the signal is equal to the number of photons/electrons detected by the sensor (this relates to the Nobel Prize Einstein received for the photoelectric effect), and the noise is thus equal to the square root of the signal (in statistical terms the noise is the standard deviation; for a Poisson process the variance equals the mean, so the standard deviation is the square root of the mean).

25. May 22, 2018

### glappkaeft

If this isn't enough, you really need to look into the math of signal processing, or at least a beginner's book on astrophotography like "Making Every Photon Count" or similar. Just like kilograms and newtons, the terms signal and noise have precise technical definitions.

The area involves mostly some specialized university-level statistics (stochastic processes), some knowledge of the photoelectric effect and the design of digital sensors, and some rather basic signal processing. Still, it is much too large a field to summarize in just a few pages. If you can read Swedish, I have a summary on astronet.se of what I perceive as the absolute basic concepts.

Last edited: May 22, 2018