Why did they choose to estimate the parameter this way?

In summary, the conversation discusses two different methods for estimating the value of X from an observed value of Y when X and Y are jointly distributed. One method finds the conditional expected value of X given Y and plugs in the observed value of Y to get an estimate for X. The other method works with the reverse conditional distribution, Y given X, and uses an unbiased estimator of X. It is difficult to determine which method is better without more information about the problem and testing on real data.
  • #1
bobby2k
Hello, I am taking an introductory class in statistics. In an earlier exam exercise they chose to estimate the answer in an odd way. To make my question concise I will give an abstraction of the problem and not go into the details, but I can give them to you as well if you'd like.

In the problem we have two jointly distributed random variables X and Y. They then observe a value of Y, and they want to estimate the value of X that goes with it.

The way they do this is by first finding the distribution of X given Y. That is
[itex]P(X=x|Y=y^{*})=f_{1}(x,y^{*})[/itex]
They then find the expected value of X given Y:
[itex]E(X|Y=y^{*})=f_{2}(y^{*})[/itex]

When they then observe the value of Y, they just plug it into the function [itex]f_{2}(y^{*})[/itex] and say that this is an estimate for X. Why does this work? When we estimate the mean of a normal distribution, for instance, we do not calculate an expected value when we want to estimate it.
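To make this concrete, here is an example of my own (just an illustration, not the distribution from the exam): if X and Y are bivariate normal with means [itex]\mu_X, \mu_Y[/itex], standard deviations [itex]\sigma_X, \sigma_Y[/itex] and correlation [itex]\rho[/itex], then the conditional mean is linear in the observed value,
[itex]E(X|Y=y^{*})=\mu_X+\rho\frac{\sigma_X}{\sigma_Y}(y^{*}-\mu_Y)=f_{2}(y^{*})[/itex],
so this method pulls the estimate toward [itex]\mu_X[/itex] whenever [itex]|\rho|<1[/itex].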

I am wondering if this other way of estimating X is better or worse (or even correct):
We can get the opposite distribution, that is Y given X:

[itex]P(Y=y|X=x^{*})=f_{3}(y,x^{*})[/itex]
Then we get an estimator [itex]\hat{X}^{*}(y)[/itex], and if this estimator is unbiased, we plug the observed value of Y into it and say that this is the estimate for X. This looks more like the "normal" way, like when we estimate the mean of a normal distribution.
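Continuing my bivariate normal illustration from above, this second approach would start from
[itex]E(Y|X=x^{*})=\mu_Y+\rho\frac{\sigma_Y}{\sigma_X}(x^{*}-\mu_X)[/itex],
and solving this for x with the observed y plugged in gives
[itex]\hat{X}^{*}(y)=\mu_X+\frac{\sigma_X}{\rho\sigma_Y}(y-\mu_Y)[/itex].
A quick check shows [itex]E(\hat{X}^{*}(Y)|X=x)=x[/itex], so this estimator is unbiased given X, while [itex]f_{2}(Y)[/itex] is not (the two agree only when [itex]|\rho|=1[/itex]).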

What do you think? Is this explained somewhere?
 
  • #2
It sounds like the method you are suggesting is a reasonable alternative for estimating X given Y. When we estimate the mean of a normal distribution we use the sample mean, whose expected value equals the parameter being estimated; your estimator [itex]\hat{X}^{*}(y)[/itex] is built in that spirit, since it is constructed from the distribution of Y given X to be unbiased for X. The first method instead uses the conditional expectation [itex]E(X|Y=y^{*})=f_{2}(y^{*})[/itex], so plugging in the observed value of Y returns the mean of X under the conditional distribution. It is difficult to say whether one method is better or worse than the other without more information about the specifics of the problem: the conditional-mean estimate minimizes the expected squared error, while the unbiased estimator avoids systematic bias at the cost of more variability. It may also be worth comparing the two methods on simulated or real data to see which one yields the more accurate results for your purposes.
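For instance, here is a quick simulation sketch of such a comparison. It assumes a bivariate normal joint distribution with known parameters (purely for illustration; the exam's actual distribution may be quite different) and compares the mean squared errors of the two estimators:

[code]
import numpy as np

# Sketch only: assumes a bivariate normal (X, Y) with known parameters,
# as an illustration rather than the exam's actual distribution.
rng = np.random.default_rng(0)

mu_x, mu_y = 0.0, 0.0
sigma_x, sigma_y = 1.0, 1.0
rho = 0.6
n = 100_000

# Draw n pairs (X, Y) from the joint distribution.
cov = [[sigma_x**2, rho * sigma_x * sigma_y],
       [rho * sigma_x * sigma_y, sigma_y**2]]
x, y = rng.multivariate_normal([mu_x, mu_y], cov, size=n).T

# Method 1: plug the observed Y into the conditional mean E(X | Y = y).
x_hat_cond = mu_x + rho * (sigma_x / sigma_y) * (y - mu_y)

# Method 2: invert the regression of Y on X (unbiased given X = x).
x_hat_unb = mu_x + (sigma_x / (rho * sigma_y)) * (y - mu_y)

print("MSE, conditional mean   :", np.mean((x_hat_cond - x) ** 2))
print("MSE, inverted regression:", np.mean((x_hat_unb - x) ** 2))
[/code]

In this Gaussian setting the conditional mean comes out with the smaller mean squared error (it is the minimum-MSE estimator by construction), while the inverted-regression estimator gives up some MSE in exchange for being unbiased given X; on a real problem the comparison would depend on the actual joint distribution.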
 

1. Why is estimating the parameter important in scientific research?

Estimating the parameter is important because it allows us to make predictions and draw conclusions about a larger population based on a smaller sample. It also helps us to understand the relationships and patterns within the data.

2. How do scientists determine the best way to estimate a parameter?

Scientists typically use statistical methods to determine the best way to estimate a parameter. This involves analyzing the data and selecting a method that is most appropriate for the type of data and the research question.

3. What factors influence the choice of the parameter estimation method?

The choice of the parameter estimation method depends on various factors such as the type of data, the research question, the sample size, and the assumptions of the statistical model being used. Other factors may include the level of accuracy required and the resources available.

4. Can different researchers choose different methods to estimate the same parameter?

Yes, different researchers may choose different methods to estimate the same parameter. This can be due to differences in their research objectives, their expertise in different statistical methods, or their personal preferences.

5. How can we ensure the accuracy and validity of the estimated parameter?

To ensure the accuracy and validity of the estimated parameter, scientists use various methods such as conducting multiple tests, using appropriate statistical techniques, and validating the results with other data sources. It is also important to clearly communicate the method used and any assumptions made during the estimation process.
