Mishra
Hello,
I am trying to find an expression for the signal-to-noise ratio of an oscillating signal on top of some white noise. In particular, I would like to know how the SNR scales with the integration time. It is well known that over some integration time ##T##, the SNR increases as ##T^{1/2}## (because the noise grows as ##T^{1/2}## while the signal grows as ##T##). I am trying to prove this, with my limited maths skills, using the power spectral density formalism.
The integration time is ##T##. I assume a signal ##x(t)=A \sin(2 \pi f_0 t)## on top of white noise with power spectral density ##S##. I am guessing that I should compare the power spectral density of the signal with that of the noise:
$$P(f)=|X(f)|^2$$
where ##X(f)## is the truncated Fourier transform:
$$X(f)=\frac{1}{\sqrt{T}}\int_0^{T} x(t)\, e^{-2 \pi i f t}\, dt$$
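As an aside, this truncated transform can be checked numerically. Below is a minimal NumPy sketch (##A = 1## and ##f_0 = 1## Hz are arbitrary choices, and the integral is approximated by a Riemann sum) that evaluates ##|X(f_0)|## for a few values of ##T##:

```python
import numpy as np

A, f0 = 1.0, 1.0  # arbitrary amplitude and frequency (Hz) for this sketch

def X_trunc(f, T, dt=1e-3):
    """Normalized truncated transform X(f) = (1/sqrt(T)) * int_0^T x(t) e^{-2 pi i f t} dt,
    approximated by a Riemann sum with step dt."""
    t = np.arange(0.0, T, dt)
    x = A * np.sin(2 * np.pi * f0 * t)
    return np.sum(x * np.exp(-2j * np.pi * f * t)) * dt / np.sqrt(T)

for T in (10, 40, 160):
    print(T, abs(X_trunc(f0, T)))
```

Printing ##|X(f_0)|## for increasing ##T## makes it easy to compare the actual scaling against the analytic expectation.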
In the limit where ##T## is large, the integral tends to a Fourier transform, yielding a delta function:
$$X(f)=\frac{A}{\sqrt{T}}\, \delta (f - f_0)$$
Hence:
$$P(f)=\frac{A^2}{T}\, \delta (f - f_0)$$
$$P(f_0)=\frac{A^2}{T}$$
Here I find that the power spectral density at the oscillation frequency decreases with time, as ##1/T##. Can someone explain to me where my misunderstanding is, please?
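To see the scaling empirically, here is a minimal simulation sketch (Python/NumPy; the sample rate ##f_s = 8## Hz, ##A = 1##, noise standard deviation ##\sigma = 0.1##, and ##f_0 = 1## Hz are all arbitrary choices). It generates the noisy sine for two integration times, computes a discrete periodogram ##P(f_k) = |X_k|^2\,\Delta t / N##, and prints the peak value next to the noise floor so the two can be compared directly:

```python
import numpy as np

fs, f0, A, sigma = 8.0, 1.0, 1.0, 0.1  # arbitrary parameters for this sketch
rng = np.random.default_rng(0)

def periodogram(x, fs):
    """Discrete periodogram P(f_k) = |X_k|^2 * dt / N, with X_k the DFT of x."""
    N = len(x)
    return np.abs(np.fft.fft(x)) ** 2 / (fs * N)

for T in (10, 40):
    N = int(T * fs)
    t = np.arange(N) / fs
    x = A * np.sin(2 * np.pi * f0 * t) + sigma * rng.standard_normal(N)
    P = periodogram(x, fs)
    k0 = round(f0 * T)  # FFT bin that lands exactly on f0 (f0 chosen on-bin)
    floor = np.mean(np.delete(P, [k0, N - k0]))  # noise level away from the peak
    print(f"T={T:3d}s  P(f0)={P[k0]:.3f}  noise floor ~ {floor:.2e}")
```

With ##f_0## chosen to fall exactly on an FFT bin, the peak of the periodogram of the pure sine comes out to ##T A^2/4##, while the white-noise floor stays flat, which is one way to sanity-check whatever scaling the analytic treatment predicts.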