#1
ForceBoard
Hello,
I've been trying to find the answer to this question on the internet but no real luck so here goes:
Imagine a signal My_signal which has two components: the actual signal A, which is not constant over time, and a noise component, call it N, whose standard deviation is more or less constant over time regardless of the amplitude of A. How do I calculate the standard deviation of A?
If sigma is the standard deviation (and assuming A and N are uncorrelated), is it:
My_signal = A + N
sigma(My_signal) = sqrt(sigma(A)^2 + sigma(N)^2)
-> sigma(A) = sqrt(sigma(My_signal)^2 - sigma(N)^2)

Thanks,
Marcus
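As a quick numerical sanity check of the variance-subtraction formula in the question (a minimal sketch, assuming A and N are independent; the signal shape and noise level below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Underlying signal A (varies over time) plus independent noise N
# whose standard deviation is known and roughly constant.
sigma_n = 0.5
a = np.sin(np.linspace(0, 40 * np.pi, n))   # sigma(A) ~ 1/sqrt(2) ~ 0.707
noise = rng.normal(0.0, sigma_n, size=n)
my_signal = a + noise

# For uncorrelated components, variances add:
#   var(My_signal) = var(A) + var(N)
# so sigma(A) can be recovered by subtracting the known noise variance.
sigma_a_est = np.sqrt(my_signal.var() - sigma_n**2)

print("true sigma(A):     ", round(float(a.std()), 3))
print("recovered sigma(A):", round(float(sigma_a_est), 3))
```

With enough samples the recovered value matches the true sigma(A) closely; the subtraction only works because the cross-term 2*cov(A, N) vanishes when A and N are uncorrelated.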