Probability: ranges in Gaussian approximation

In summary, the thread concerns a probability question about drawing a counter at random from an opaque bag and finding the probability that the score falls within certain ranges. The poster obtains a mean of 1.66 and a variance of 2.22, giving ranges from 0.66 to 2.32 and from 0.17 to 3.17, and asks for help with the remaining step: applying the Gaussian approximation of the binomial distribution and comparing the answers.
  • #1
mmh37
Hello everyone,

I got stuck on a probability question and would be very thankful if someone could give me a hint:

An opaque bag contains 10 green counters and 20 red. One counter is selected at random and then replaced: green scores 1 and red scores 0.

1) Calculate the probability of obtaining scores in the ranges <r> ± 0.5√var(x) and <r> ± √var(x).


That doesn't seem too bad. For the variance and the mean I got 2.22 and 1.66 respectively, so the ranges are:

a) from 0.66 to 2.32, so I just add P(1) + P(2) = 0.67

b) from 0.17 to 3.17, so this is P = P(1) + P(2) + P(3) = 0.823
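
For a quick numerical check of these sums, here is a minimal sketch assuming five independent draws with p = P(green) = 10/30 = 1/3 (the number of draws is not stated above, but it matches the mean of 5/3 appearing in the Gaussian quoted in part 2):

```python
# Exact binomial sums over the two quoted ranges.
# Assumption: five independent draws, each scoring 1 with p = 10/30 = 1/3.
from math import comb

n, p = 5, 1 / 3   # assumed number of draws and probability of scoring 1

def binom_pmf(r: int) -> float:
    """Exact probability of a total score r in n draws."""
    return comb(n, r) * p**r * (1 - p) ** (n - r)

for lo, hi in [(0.66, 2.32), (0.17, 3.17)]:
    prob = sum(binom_pmf(r) for r in range(n + 1) if lo <= r <= hi)
    print(f"P({lo} <= score <= {hi}) = {prob:.3f}")   # ~0.658 and ~0.823
```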



2) The Gaussian approximation of the binomial distribution in (1) is given as P(r) = exp[-9(r - 5/3)^2/20]; now do the same as in (1) and compare the answers. In what sense is P(r) a good approximation to p?

OK, so I don't know how to go on from here. Help is very much appreciated!
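
One possible way to proceed (a sketch of one reading, not necessarily the intended method): the quoted Gaussian corresponds to a normal curve with mean 5/3 and variance 10/9, since its exponent can be written as -(r - 5/3)^2 / (2·10/9). The probability of each range can then be estimated either by integrating the normalised curve over the range or by summing its values at the integer scores and normalising:

```python
# One way to evaluate the quoted Gaussian P(r) = exp(-9*(r - 5/3)**2 / 20)
# over the two ranges: treat it as a normal curve with mean 5/3 and
# variance 10/9 (so 2*sigma**2 = 20/9), then either integrate the
# normalised curve or sum it at integer scores and normalise.
from math import exp, sqrt, erf

mu, sigma = 5 / 3, sqrt(10 / 9)

def gauss(r: float) -> float:
    """The quoted Gaussian, without a normalising prefactor."""
    return exp(-9 * (r - mu) ** 2 / 20)

def area(lo: float, hi: float) -> float:
    """Area under the normalised Gaussian between lo and hi."""
    z = lambda x: (x - mu) / (sigma * sqrt(2))
    return 0.5 * (erf(z(hi)) - erf(z(lo)))

norm = sum(gauss(r) for r in range(6))          # normalise over r = 0..5
for lo, hi in [(0.66, 2.32), (0.17, 3.17)]:
    summed = sum(gauss(r) for r in range(6) if lo <= r <= hi) / norm
    print(f"[{lo}, {hi}]: integral = {area(lo, hi):.3f}, integer sum = {summed:.3f}")
```

Comparing these numbers with the exact binomial sums above is one way to judge how good the approximation is.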
 
  • #2
Can anyone help?
 

1. What is the definition of probability in a Gaussian approximation?

The probability in a Gaussian approximation refers to the likelihood of a certain value falling within a range of a normal distribution. It is typically represented by the area under the curve of the normal distribution within that range.
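
As a concrete illustration (a small standalone sketch with a normal distribution, not tied to the problem above), the area under the curve between two limits can be computed from the normal CDF:

```python
# Probability that a normally distributed value lands in [a, b],
# computed as F(b) - F(a) with the normal CDF written via math.erf.
from math import erf, sqrt

def normal_cdf(x: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

def prob_in_range(a: float, b: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    return normal_cdf(b, mu, sigma) - normal_cdf(a, mu, sigma)

print(prob_in_range(-1, 1))   # within one standard deviation: ~0.683
print(prob_in_range(-2, 2))   # within two standard deviations: ~0.954
```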

2. How is the range of probability determined in a Gaussian approximation?

The width of a probability range in a Gaussian approximation is set by the standard deviation of the normal distribution: for a fixed probability, a larger standard deviation means a wider interval around the mean.
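
A tiny sketch of this scaling, using the familiar 1.96-standard-deviation rule for a 95% interval (the specific numbers are just for illustration):

```python
# For a fixed probability (~95%), the interval around the mean is roughly
# mu +/- 1.96*sigma, so its width grows in proportion to sigma.
for sigma in (1.0, 2.0, 5.0):
    half_width = 1.96 * sigma
    print(f"sigma = {sigma}: ~95% of values lie within ±{half_width:.2f} of the mean")
```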

3. What is the difference between probability and confidence intervals in a Gaussian approximation?

Probability refers to the likelihood of a value falling within a certain range, while confidence intervals refer to the range in which a population parameter is likely to fall within a certain level of confidence.
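
A short sketch of the distinction, using a small made-up sample (all numbers here are hypothetical):

```python
# Contrast: a confidence interval is a range for an unknown parameter
# (here the mean) and narrows as the sample grows, while a probability
# statement about a single value uses the spread of the values themselves.
from math import sqrt
from statistics import mean, stdev

sample = [1.9, 2.1, 2.0, 2.3, 1.8, 2.2, 2.0, 2.1]   # hypothetical data
m, s, n = mean(sample), stdev(sample), len(sample)

# Approximate 95% confidence interval for the population mean.
half_width = 1.96 * s / sqrt(n)
print(f"95% CI for the mean: [{m - half_width:.2f}, {m + half_width:.2f}]")

# Range expected to contain roughly 95% of individual values (much wider).
print(f"~95% of single values: [{m - 1.96 * s:.2f}, {m + 1.96 * s:.2f}]")
```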

4. How is a Gaussian approximation used in real-world applications?

A Gaussian approximation is commonly used in statistics and data analysis to approximate the distribution of real-world data. It is also used in various fields such as finance, engineering, and physics to model and analyze data.

5. What are the limitations of using a Gaussian approximation for probability?

One major limitation of using a Gaussian approximation for probability is that it assumes a normal distribution, which may not always be the case in real-world data. Additionally, it may not accurately capture extreme values or outliers in the data.
