
## Main Question or Discussion Point

Suppose I have a radar gun that measures velocity with an internal error whose mean is 3 m/s and whose standard deviation is 1 m/s.

e.g. velocity readings (m/s):

10 ± 2.8

6 ± 3.1

21 ± 3.2

and so on. Now I want to predict what my measurements will look like when I get a new radar gun, whose internal error will be roughly 1 m/s.

How should I then scale the standard deviation? By a factor of three?

Thanks all!
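One way to see what the quoted numbers mean is a quick simulation (a sketch only; the true speed `true_v = 10` is an assumed value for illustration). The key distinction is that the *mean* of the internal error shifts every reading by the same amount (a bias), while its *standard deviation* sets the spread of repeated readings. Which of those two the new gun's "roughly 1 m/s" refers to determines whether any scaling of the quoted ± values is appropriate.

```python
import numpy as np

rng = np.random.default_rng(0)

true_v = 10.0  # hypothetical true speed in m/s (assumed for this sketch)

# Old gun: internal error with mean 3 m/s and std dev 1 m/s, as stated above.
old_errors = rng.normal(loc=3.0, scale=1.0, size=100_000)
old_readings = true_v + old_errors

# The readings are centred 3 m/s above the true value (the bias),
# but their spread is set by the error's std dev, not its mean.
print(old_readings.mean())  # close to 13.0
print(old_readings.std())   # close to 1.0
```

So the error mean and the error standard deviation play different roles in the readings, and a single scale factor on the ± values only makes sense once you know which quantity the "1 m/s" spec describes.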
