GhostLoveScore

I've made a small program that performs an FFT (Fast Fourier Transform) on the signal from a radio receiver (an RTL-SDR) fed by a parabolic antenna. The output is, of course, amplitude as a function of frequency.
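Roughly, that FFT step might look like the sketch below (the sample rate, FFT length, and IQ data here are placeholders; a real capture would come from the dongle):

```python
import numpy as np

# Hypothetical RTL-SDR settings - substitute your own.
sample_rate = 2.4e6      # samples per second (Hz)
center_freq = 1420.0e6   # tuned frequency (Hz)
n_fft = 4096             # FFT length

# Placeholder complex IQ samples standing in for a real capture.
rng = np.random.default_rng(0)
iq = rng.standard_normal(n_fft) + 1j * rng.standard_normal(n_fft)

# Windowed FFT; the power spectrum is |X|^2 (amplitude squared).
window = np.hanning(n_fft)
spectrum = np.fft.fftshift(np.fft.fft(iq * window))
power = np.abs(spectrum) ** 2

# Frequency axis centered on the tuned frequency.
freqs = center_freq + np.fft.fftshift(np.fft.fftfreq(n_fft, d=1.0 / sample_rate))
```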

I want to use it to detect and measure the 1420 MHz radiation from space, but I'm not sure what the best way to do that is. I know the intensity is proportional to amplitude squared, and that intensity I = P/A, where I is intensity, P is the received power, and A is the area of my antenna.

A common unit in radio astronomy seems to be the jansky (a unit of flux density), which is proportional to P/(A*bw), where P is the received power, A is the effective area of the receiving antenna, and bw is the bandwidth of the receiving system. (1 Jy = 10^-26 W m^-2 Hz^-1.)
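Just to make the unit arithmetic concrete, here is that definition with made-up example numbers (P, A, and bw below are hypothetical, not measured values):

```python
# Hypothetical example values for the flux-density formula S = P / (A * bw).
P = 1e-20    # received power in watts (made up)
A = 1.5      # effective antenna area in m^2 (made up)
bw = 2.0e6   # receiver bandwidth in Hz

# Flux density in SI units (W m^-2 Hz^-1), then converted to janskys.
S_si = P / (A * bw)
S_jy = S_si / 1e-26   # 1 Jy = 1e-26 W m^-2 Hz^-1
```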

So, I'm a bit confused... My assumption is that I should express what my antenna is outputting in janskys. (Is that correct?)

That means:

flux density = I/bw ∝ V^2/bw, where V is the output from my receiver. But since I'm doing an FFT over a range of frequencies (about 2 MHz), what should I do here? How can I calculate flux density from my FFT result? Integrate the (squared) amplitudes over the frequency range? Something else?
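One common approach (sketched below with placeholder data; the sample rate and FFT length are assumptions) is to sum the per-bin powers over the band, which by Parseval's theorem is proportional to the total time-domain power, and then divide by the bandwidth to get a band-averaged power spectral density. An absolute calibration step would still be needed to turn that into janskys:

```python
import numpy as np

# Assumed acquisition parameters (placeholders).
sample_rate = 2.4e6
n_fft = 4096
bin_width = sample_rate / n_fft          # Hz per FFT bin

# Placeholder time-domain samples standing in for a real capture.
rng = np.random.default_rng(1)
x = rng.standard_normal(n_fft)
power = np.abs(np.fft.fft(x)) ** 2       # per-bin power |X[k]|^2

# Total (uncalibrated) power in the band: sum of the per-bin powers.
# Parseval: sum(|X[k]|^2) == n_fft * sum(|x[n]|^2).
band_power = power.sum()

# Band-averaged power spectral density (power per Hz of bandwidth).
psd_avg = band_power / (n_fft * bin_width)

# To reach janskys you would still need an absolute calibration:
# S [Jy] = P / (A_eff * bw) / 1e-26, with P in watts and A_eff in m^2.
```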