#1 fahraynk
Suppose I am trying to approximate a function that I do not know but can measure, and each measurement takes a lot of effort.
Say the function I am approximating is ##y=f(x)## and ##x \in [0,100]##
Suppose I know the expectation and variance of ##f(x)##.
Is there a way to compute the confidence of the regression approximation as a function of sample size?
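(Added illustration, not part of the original question: under the common assumption that each measurement is ##f(x)## plus independent noise with a known standard deviation ##\sigma##, the 95% confidence half-width of an estimate shrinks roughly like ##1/\sqrt{n}##. The value of ##\sigma## below is just a placeholder.)

```python
import numpy as np

# Illustrative sketch only: assumes measurements y_i = f(x) + noise with
# independent Gaussian noise of known standard deviation sigma.
# The 95% confidence half-width of the sample mean shrinks like sigma / sqrt(n).
sigma = 2.0  # assumed noise level, not taken from the question
for n in (5, 10, 20, 50, 100):
    half_width = 1.96 * sigma / np.sqrt(n)
    print(f"n = {n:3d}   95% CI half-width ~ {half_width:.3f}")
```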
Is there a maximum number of useful measurements? For example, if I take 20 measurements between ##x=1## and ##x=2##, and 1 measurement at ##x=20##, another measurement at ##x=2## won't really tell me any new information, right? So shouldn't there be a maximum number of measurements, or a minimum distance between measurements, based on the standard deviation of ##f(x)##?
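(Added sketch, not from the original post: one way to make this intuition quantitative is to model the smoothness of ##f## with a Gaussian process. Its posterior uncertainty depends only on where measurements are placed, so you can check directly that a 21st point crammed into ##[1,2]## barely changes the uncertainty far away, while a point at a new location changes it a lot. The squared-exponential kernel, length scale, and noise level below are assumed values, not anything stated in the question.)

```python
import numpy as np

def rbf_kernel(a, b, length_scale=10.0, sigma_f=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs (assumed model)."""
    d = a[:, None] - b[None, :]
    return sigma_f**2 * np.exp(-0.5 * (d / length_scale)**2)

def posterior_std(x_train, x_query, noise_std=0.1, **kw):
    """Gaussian-process posterior standard deviation at x_query given measurement
    locations x_train; the measured values themselves are not needed for this."""
    K = rbf_kernel(x_train, x_train, **kw) + noise_std**2 * np.eye(len(x_train))
    K_s = rbf_kernel(x_query, x_train, **kw)
    K_ss = rbf_kernel(x_query, x_query, **kw)
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
    return np.sqrt(np.clip(np.diag(cov), 0.0, None))

# The scenario in the question: 20 points in [1, 2] plus one point at x = 20.
x_clustered = np.concatenate([np.linspace(1, 2, 20), [20.0]])
query = np.array([1.5, 20.0, 50.0])  # places to evaluate the uncertainty

base       = posterior_std(x_clustered, query)
extra_near = posterior_std(np.append(x_clustered, 2.0), query)   # one more point at x = 2
extra_far  = posterior_std(np.append(x_clustered, 50.0), query)  # one point at x = 50 instead

for label, s in [("20 clustered + x=20 ", base),
                 ("  + another at x=2  ", extra_near),
                 ("  + one at x=50     ", extra_far)]:
    print(label, np.round(s, 3))
```

The extra point at ##x=2## leaves the uncertainty at ##x=50## essentially unchanged, while a point at ##x=50## collapses it, which is the "diminishing information" effect being asked about.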