Unbiased estimator for the exponential distribution

  • #1
bennyska

Homework Statement


Let X1, X2, ..., Xn form a random sample of size n from the exponential distribution whose pdf is f(x|B) = Be^(-Bx) for x > 0 and B > 0. Find an unbiased estimator of B.


Homework Equations





The Attempt at a Solution


Nothing yet. I don't really know where to get started; a push in the right direction would be greatly appreciated.
I'm not really clear on how to calculate unbiased estimators. I believe I want an estimator, [itex]\hat{B}[/itex], such that E([itex]\hat{B}[/itex]) = B, but I'm not really sure how to find it. So a general hint on how to get started would be awesome. Thanks!
 
  • #2
So first assume you know B; the density of a single observation X1 is
[tex] f(X_1|B) = Be^{-B X_1} [/tex]

Then for several random variables, the joint density (likelihood) of the sequence X1, X2, ..., Xn given B is:
[tex] f(X_1|B)f(X_2|B)\cdots f(X_n|B)
= Be^{-B X_1}Be^{-B X_2}\cdots Be^{-B X_n}
= B^n e^{-B( X_1+X_2 +\cdots+X_n)}
[/tex]

Consider maximising this with respect to B; this yields the MLE (maximum likelihood estimator) for B. You can then check whether it is biased and consider how to alter it to remove the bias if required...
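
As a quick illustration of that maximisation step (my own addition, not part of the original reply), the log of the likelihood above can be evaluated on a grid of candidate B values for a simulated sample; the grid point with the largest value approximates the MLE. The rate 2.0, sample size 50, and grid limits below are arbitrary choices.
[code]
import numpy as np

rng = np.random.default_rng(0)
B_true = 2.0                                      # arbitrary "true" rate for the simulation
n = 50
x = rng.exponential(scale=1.0 / B_true, size=n)   # numpy parametrises by the mean 1/B

def log_likelihood(B, x):
    # log of B^n * exp(-B * sum(x)), i.e. the product derived above
    return len(x) * np.log(B) - B * x.sum()

grid = np.linspace(0.01, 10.0, 10_000)
B_numeric = grid[np.argmax(log_likelihood(grid, x))]
print("numerical maximiser:", B_numeric)
print("true rate:          ", B_true)
[/code]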
 
  • #3
Thanks. Is this the normal way of finding unbiased estimators: start from a known estimator, such as the MLE, check whether it's biased, and alter it as needed with a constant factor?
 
  • #4
Not too sure; I've only played with MLEs and bias a little, but it seems like a reasonable approach and would work for things like the sample variance.

Probably worth a crack to see if you can get it to work.
 
  • #5
So if I did my math right, I got B. Does it make sense to say B is an unbiased estimator for B?
 
  • #6
Not really... you don't know B. You want to get to some estimator for B in terms of the observations, [itex] \hat{B} = f(X_1, X_2, \dots, X_n) [/itex]

Show your working.
 
  • #7
So I take that function you came up with, replace B with [itex]\hat{B}[/itex], take the natural log, take the derivative, set it to 0, and get [itex]\hat{B} = 1/\bar{x}[/itex] (one over the sample average) for my MLE, whereas B = 1/[itex]\mu[/itex], where [itex]\mu[/itex] is the true mean.
So when I take [itex]E(1/\bar{x})[/itex], I get [itex]\frac{B}{\bar{x}}\int e^{-Bx}\,dx[/itex], which gives [itex]\frac{B}{\bar{x}}\cdot\frac{-1}{B}e^{-Bx} = \frac{-1}{\bar{x}}e^{-Bx}[/itex] evaluated from 0 to infinity, which is also throwing me off because x > 0. But when I do that, I get [itex]E(1/\bar{x}) = 1/\bar{x}[/itex]. Does that seem right?
 
  • #8
OK, so let me check this. Taking the log of the likelihood function gives
[tex] \ln (f(X_1|B)f(X_2|B)\cdots f(X_n|B))
[/tex]
[tex]
= \ln(f(X_1|B)) + \ln(f(X_2|B)) + \cdots
[/tex]
[tex]
= \ln(B) + \ln(e^{-B X_1}) + \ln(B) + \ln(e^{-B X_2}) + \cdots
[/tex]
[tex]
= n\ln(B) + \ln(e^{-B X_1}) + \ln(e^{-B X_2}) + \cdots
[/tex]
[tex]
= n\ln(B) - B X_1 - B X_2 - \cdots - B X_n
[/tex]

Then differentiating w.r.t. B gives
[tex] n/B - X_1 - X_2 - \cdots - X_n [/tex]

Equating this to zero for our MLE gives
[tex] \hat{B} = \frac{n}{\sum_{i=1}^n X_i}[/tex]

which is one over the sample average, as you say.
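
(One small check worth adding, which the original post leaves implicit: the second derivative of the log-likelihood is
[tex] \frac{d^2}{dB^2}\left( n\ln(B) - B\sum_{i=1}^n X_i \right) = -\frac{n}{B^2} < 0 [/tex]
so the stationary point really is a maximum.)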
 
  • #10
[tex]E(\frac{1}{\bar{x}})=\int\frac{1}{\bar{x}}\beta e^{-\beta x}dx=\frac{\beta}{\overline{x}}\int e^{-\beta x}dx=\frac{\beta}{\bar{x}}\frac{-1}{\beta}e^{-\beta x}|_{0}^{\infty}=\frac{-\beta}{\bar{x}\beta}e^{-\beta x}|_{0}^{\infty}=\frac{-1}{\bar{x}}e^{-\infty}-\frac{-1}{\bar{x}}e^{0}=0+\frac{1}{\bar{x}}=\frac{1}{\bar{x}}=\hat{\beta}[/tex]

something like that?

now, in order for this to be unbiased, does it need to be identically equal to [tex]\beta[/tex]?
 
  • #11
OK, so first the probability density:
[tex]
p(x) = \beta e^{-\beta x}
[/tex]

Then the mean follows after some integration by parts
[tex](u=x, \; dv=\beta e^{-\beta x}dx)[/tex]
[tex](du=dx, \; v=-e^{-\beta x})[/tex]
[tex]\int u\,dv = uv - \int v\,du[/tex]
[tex]
\mu = E[X]
= \int_0^{\infty} x\, \beta e^{-\beta x}\, dx [/tex]
[tex]= (-x e^{-\beta x})\Big|_0^{\infty} + \int_0^{\infty} e^{-\beta x}\, dx [/tex]
[tex]= (0-0) + \left(-\frac{1}{\beta} e^{-\beta x} \right)\Big|_0^{\infty} = \frac{1}{\beta} [/tex]

which gives us a bit of confidence
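
(As a sanity check on that integral, here is a short symbolic computation; this is my own addition using sympy, which is not mentioned in the thread.)
[code]
import sympy as sp

x, beta = sp.symbols('x beta', positive=True)

# E[X] for the exponential density beta * exp(-beta * x) on (0, oo)
mean = sp.integrate(x * beta * sp.exp(-beta * x), (x, 0, sp.oo))
print(mean)  # 1/beta
[/code]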
 
  • #12
bennyska said:
[tex]E(\frac{1}{\bar{x}})=\int\frac{1}{\bar{x}}\beta e^{-\beta x}dx=\frac{\beta}{\overline{x}}\int e^{-\beta x}dx=\frac{\beta}{\bar{x}}\frac{-1}{\beta}e^{-\beta x}|_{0}^{\infty}=\frac{-\beta}{\bar{x}\beta}e^{-\beta x}|_{0}^{\infty}=\frac{-1}{\bar{x}}e^{-\infty}-\frac{-1}{\bar{x}}e^{0}=0+\frac{1}{\bar{x}}=\frac{1}{\bar{x}}=\hat{\beta}[/tex]

something like that?

now, in order for this to be unbiased, does it need to be identically equal to [tex]\beta[/tex]?

I'm pretty sure you can't just take the sample mean [itex]\bar{x}[/itex] outside the integral like that, as it is itself a function of the random variables you are integrating over
 
  • #13
Now, interestingly, the following integral does not converge
[tex] E[\hat{\beta}(X_1)] = E[1/X_1] = E[1/X] = \int_0^{\infty} \frac{1}{x}\, \beta e^{-\beta x}\, dx = \infty[/tex]

which hints that something is not quite right with our estimator (the integrand behaves like [itex]\beta/x[/itex] near zero, so the integral diverges). I think it has infinite bias for the n = 1 case?

Though the estimator itself agrees with a wiki search, which doesn't mention bias:
http://en.wikipedia.org/wiki/Exponential_distribution#Parameter_estimation
 
  • #14
OK, so I checked back in my text: if you were to estimate 1/β using the sample mean, it would be unbiased.

However, estimating β itself does lead to a bias, and the unbiased estimator is in fact
[tex] \hat{\beta} = \frac{n-1}{n}\frac{1}{\bar{X}} = \frac{n-1}{\sum_{i=1}^n X_i}[/tex]
(the expected value of the MLE works out to [itex]\frac{n}{n-1}\beta[/itex], which is singular for n = 1, matching the divergence found above)

Now, the text didn't have a derivation, but I think the way to start would be to use the sum of exponential variables, which will be a convolution, to derive the distribution of
[tex] \bar{ X }[/tex]

then use that to find the expectation
[tex] E(\frac{1}{ \bar{X}} )[/tex]
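
To see the bias and the correction numerically, here is a small Monte Carlo sketch (my own addition, not from the thread); the true rate 2.0, the sample size 5, and the number of replications are arbitrary choices.
[code]
import numpy as np

rng = np.random.default_rng(1)
beta_true = 2.0          # arbitrary true rate for the experiment
n = 5                    # small sample size so the bias is visible
reps = 200_000

samples = rng.exponential(scale=1.0 / beta_true, size=(reps, n))
sums = samples.sum(axis=1)

mle = n / sums               # MLE, with expectation n/(n-1) * beta
corrected = (n - 1) / sums   # bias-corrected estimator

print("true beta:         ", beta_true)
print("mean of MLE:       ", mle.mean())        # close to 2.5 = (5/4) * 2.0
print("mean of corrected: ", corrected.mean())  # close to 2.0
[/code]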
 
  • #15
So if we let Y = X_1 + X_2, then we have
[tex]
p_Y(y) = \int \int dx_1\, dx_2\, p(x_1)p(x_2)\, \delta(y-x_1-x_2) [/tex]
[tex]
= \int_0^y dx\, p(x)p(y-x) [/tex]
[tex]
= \int_0^y dx\, \beta^2 e^{-\beta(x+(y-x))} [/tex]
[tex]
= \int_0^y dx\, \beta^2 e^{-\beta y} [/tex]
[tex]
= \beta^2 e^{-\beta y}\, x \Big|_0^y [/tex]
[tex]
= \beta^2 y e^{-\beta y}
[/tex]

Hopefully you can generalise from here to n samples...
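
For completeness, here is where that generalisation leads (my own sketch, not part of the original post). The sum [itex] S = X_1 + \cdots + X_n [/itex] of n independent exponentials has the gamma density
[tex] p_S(s) = \frac{\beta^n s^{n-1} e^{-\beta s}}{(n-1)!}, \qquad s > 0 [/tex]
so for n ≥ 2
[tex] E\left[\frac{1}{S}\right] = \int_0^{\infty} \frac{1}{s}\, \frac{\beta^n s^{n-1} e^{-\beta s}}{(n-1)!}\, ds = \frac{\beta^n}{(n-1)!}\cdot\frac{(n-2)!}{\beta^{n-1}} = \frac{\beta}{n-1} [/tex]
Hence [itex] E[n/S] = \frac{n}{n-1}\beta [/itex], so the MLE overestimates [itex]\beta[/itex], and multiplying it by (n-1)/n gives [itex] E[(n-1)/S] = \beta [/itex], which is exactly the bias-corrected estimator quoted above.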
 

1. What is an unbiased estimator for an exponential distribution?

An unbiased estimator for an exponential distribution is a statistical method that accurately estimates the parameters of an exponential distribution without any systematic bias. This means that on average, the estimator will produce estimates that are equal to the true values of the parameters.

2. How is an unbiased estimator for an exponential distribution calculated?

There are several methods for calculating an unbiased estimator for an exponential distribution, including the method of moments, maximum likelihood estimation, and least squares estimation. These methods involve using sample data to estimate the parameters of the exponential distribution and adjusting the estimates to eliminate any bias.
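
As a minimal illustration of one of these routes (my own addition, not part of the original page): for the exponential distribution, the method of moments matches the sample mean to E[X] = 1/β, giving the same estimate 1/x̄ that the thread derives as the MLE.
[code]
import numpy as np

def moment_estimate(x):
    """Method-of-moments estimate of the exponential rate: solve mean(x) = 1/beta."""
    x = np.asarray(x, dtype=float)
    return 1.0 / x.mean()

# Hypothetical data; in practice x would be the observed sample.
x = [0.3, 1.2, 0.7, 0.1, 2.4]
print(moment_estimate(x))
[/code]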

3. What is the purpose of using an unbiased estimator for an exponential distribution?

The purpose of using an unbiased estimator for an exponential distribution is to obtain accurate estimates of the true parameters of the distribution. This is important for making accurate predictions and inferences based on the data.

4. Can an unbiased estimator for an exponential distribution produce biased estimates?

An unbiased estimator has no systematic bias by definition, but any individual estimate can still be far from the true value. In practice, biased results can also arise if the sample data are not representative of the population or if the assumptions behind the method are not met.

5. How can you determine if an estimator for an exponential distribution is unbiased?

To determine whether an estimator for an exponential distribution is unbiased, compute its expected value and compare it to the true value of the parameter: the estimator is unbiased if its expectation equals the parameter for every possible parameter value. This can be shown with a mathematical proof or checked approximately with simulations.
