MHB Unsolved statistics questions from other sites....

  • Thread starter: chisigma
  • Tags: Statistics

Summary
The discussion addresses unanswered statistics questions from various forums, starting with a question about the normal distribution of the difference between two population variables. It concludes that without the assumption of normality for the populations, one cannot claim that the difference is normally distributed. The thread also explores another question regarding the probability of a circle's center being inside a triangle formed by three random points on its circumference, arriving at a probability of 1/4. Additionally, it examines the expected value and variance of service times for a queue at a post office, confirming that these can be calculated using properties of independent random variables. The overall focus is on clarifying statistical concepts and providing solutions to unresolved queries.
  • #31
chisigma said:
Posted on 05 23 2012 on www.artofproblemsolving.com by the member pablo_ro and not yet properly solved…

Let X and Y be two independent random variables with p.d.f.s $f_{x}(*)$ and $f_{y}(*)$. Find the p.d.f.s of the r.v.s $U=X\ Y$ and $V=X/Y$...


Let us first compute the 'product distribution' of the r.v. $U=X\ Y$. Introducing the marginal variable V we can write...

$\displaystyle U=X\ Y, V=Y \implies X= \frac{U}{V}, Y=V $ (1)

... and computing the Jacobian of (1) we obtain $\displaystyle J= \frac{1}{|V|}$. Now if we indicate with $\displaystyle f_{x,y}(*,*)$ the joint p.d.f. of X and Y we obtain...

$\displaystyle f_{u}(u)= \int_{-\infty}^{+\infty} \frac{1}{|v|}\ f_{x,y} (\frac{u}{v}, v)\ dv$ (2)

If we want the 'ratio distribution' instead of the 'product distribution' we only have to set in (1) $\displaystyle U=\frac{X}{Y} \implies X = U V$ obtaining...

$\displaystyle f_{u}(u)= \int_{-\infty}^{+\infty} |v|\ f_{x,y} (u\ v, v)\ dv$ (3)

Kind regards

$\chi$ $\sigma$
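A quick numerical sanity check of (2), a sketch of my own (the uniform example is illustrative, not part of the quoted problem): for X, Y uniform on (0,1), formula (2) gives $f_{u}(u)= -\ln u$ on (0,1), hence $P\{U \le u_{0}\} = u_{0} - u_{0}\ \ln u_{0}$...

```python
import math
import random

# Monte Carlo check of formula (2) for X, Y ~ Uniform(0,1):
# f_u(u) = ∫_u^1 (1/v) dv = -ln u on (0,1), so P{U <= u0} = u0 - u0*ln(u0)
random.seed(0)
trials, u0 = 200_000, 0.25
hits = sum(1 for _ in range(trials)
           if random.random() * random.random() <= u0)
empirical = hits / trials
analytic = u0 - u0 * math.log(u0)
print(empirical, analytic)   # both ≈ 0.60
```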
 
  • #32
Posted on 06 04 2012 on www.matematicamente.it by stelladinatale [original in Italian…] and not yet solved…

In the time interval (0,t] fellows are born according to a Poisson process with parameter 1. Each fellow has a watch that rings according to a Poisson process with parameter a; when the watch rings, the fellow dies. Show that the probability that a fellow born in (0,t] is still alive at time t is…

$\displaystyle p(t)= \frac{1-e^{-a\ t}}{a\ t}$

Kind regards

$\chi$ $\sigma$
 
  • #33
chisigma said:
Posted on 06 04 2012 on www.matematicamente.it by stelladinatale [original in Italian…] and not yet solved…

In the time interval (0,t] fellows are born according to a Poisson process with parameter 1. Each fellow has a watch that rings according to a Poisson process with parameter a; when the watch rings, the fellow dies. Show that the probability that a fellow born in (0,t] is still alive at time t is…

$\displaystyle p(t)= \frac{1-e^{-a\ t}}{a\ t}$

Given that a fellow was born in (0,t], his birth time $\tau$ is uniformly distributed between 0 and t. The probability that the fellow is still alive at time t is then...

$\displaystyle p(t)= \frac{a}{t}\ \int_{0}^{t} \int_{t-\tau}^{\infty} e^{-a\ x}\ dx\ d \tau = \frac{1}{t}\ \int_{0}^{t} e^{- a\ (t-\tau)}\ d \tau = \frac{1-e^{-a\ t}}{a\ t}$ (1)

Kind regards

$\chi$ $\sigma$
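The result (1) is easy to confirm by simulation; a sketch (the values a=1, t=2 are arbitrary illustrative choices):

```python
import math
import random

# A fellow's birth time tau is Uniform(0, t); he is alive at t iff his
# Exp(a) lifetime exceeds t - tau.  Expected survival probability:
# p(t) = (1 - e^{-a t})/(a t)
random.seed(1)
a, t, trials = 1.0, 2.0, 200_000
alive = sum(1 for _ in range(trials)
            if random.expovariate(a) > t - random.uniform(0.0, t))
empirical = alive / trials
analytic = (1.0 - math.exp(-a * t)) / (a * t)
print(empirical, analytic)   # both ≈ 0.43
```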
 
  • #34
Posted on 05 21 2012 on www.artofproblemsolving.com by the member pablo_ro and not yet properly solved...

Computing integral using the random variables...

$\displaystyle \int_{0}^{1} ... \int_{0}^{1} \sqrt{x_{1}^{2}+ ...+ x_{n}^{2}}\ d x_{1}...d x_{n}$


Of course that is not a question in the area of probability, anyway that is a suggestive question the solution of which is not trivial...

Kind regards

$\chi$ $\sigma$
 
  • #35
chisigma said:
Posted on 05 21 2012 on www.artofproblemsolving.com by the member pablo_ro and not yet properly solved...

Computing integral using the random variables...

$\displaystyle \int_{0}^{1} ... \int_{0}^{1} \sqrt{x_{1}^{2}+ ...+ x_{n}^{2}}\ d x_{1}...d x_{n}$


Of course that is not a question in the area of probability, anyway that is a suggestive question the solution of which is not trivial...

Kind regards

$\chi$ $\sigma$


The wording here is pretty obscure, but it seems to imply computing the integral using Monte Carlo methods, in which case we would need a specific value for n.

CB
 
  • #36
chisigma said:
Posted on 05 21 2012 on www.artofproblemsolving.com by the member pablo_ro and not yet properly solved...

Computing integral using the random variables...

$\displaystyle \int_{0}^{1} ... \int_{0}^{1} \sqrt{x_{1}^{2}+ ...+ x_{n}^{2}}\ d x_{1}...d x_{n}$


Of course that is not a question in the area of probability, anyway that is a suggestive question the solution of which is not trivial...

Kind regards

$\chi$ $\sigma$

This seems to describe the average distance from the origin of a uniformly random point in the $n$-dimensional unit hypercube. For what it's worth, I had Mathematica churn out the definite integrals for the first few $n$. I was able to get analytical solutions up to $n = 3$, but the closed-form expression for $n = 3$ is too unwieldy to post here. For $n > 11$, it takes a long time to produce a result:

$\begin{array}{|l|r|}
n &\mathrm{Integral} \\
~ &~ \\
1 & \frac{1}{2} \\
2 &\frac{\sqrt{2} + \sinh^{-1}{1}}{3} = 0.765 \\
3 &0.961 \\
4 &1.12 \\
5 &1.26 \\
6 &1.39 \\
7 &1.50 \\
8 &1.61 \\
9 &1.71 \\
10 &1.81\\
11 &1.90\\
\end{array}$

It seems to grow at a decreasing rate. The integral is relatively well approximated by $\sqrt{\frac{n}{3}}$, and the approximation gets increasingly better as $n$ increases. This estimate is obtained by taking the square root outside the integral, which is asymptotically valid as $n$ grows large because $\frac{1}{n} \sum_{i} x_{i}^{2}$ concentrates around $E\{x^{2}\} = \frac{1}{3}$. Relatedly, the Euclidean metric is a poor choice of distance function in high-dimensional space (the curse of dimensionality).

I don't see a way to calculate a nice closed-form expression for the integral as a function of $n$, but I suspect it is possible. That's all I can say though (Sadface)
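The table above can be reproduced to Monte Carlo accuracy with a short sketch (the sample size and seed are arbitrary choices of mine):

```python
import math
import random

def mc_norm(n, samples=100_000):
    """Monte Carlo estimate of the integral of sqrt(x_1^2 + ... + x_n^2)
    over the unit hypercube [0,1]^n."""
    total = 0.0
    for _ in range(samples):
        total += math.sqrt(sum(random.random() ** 2 for _ in range(n)))
    return total / samples

random.seed(2)
print(mc_norm(2))    # exact: (sqrt(2) + asinh(1))/3 ≈ 0.765
print(mc_norm(10))   # table value 1.81; asymptotic sqrt(10/3) ≈ 1.826
```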
 
  • #37
Just to show the power of Monte Carlo as a method for estimating high-order multiple integrals, I attach a plot of the integral against the order:

View attachment 182

Note 1: the last label on the horizontal axis has been clipped; it is 10000.

Note 2: the standard error is more or less independent of n, and is less than 0.01 when taking the average over 1000 sample points for each n.

Note 3: looking at the curve in detail it is possible to detect what appears to be a deviation from the asymptotic form \( \sqrt{n/3} \); IIRC \( I_n \approx \sqrt{n/3}-0.03 \) is a better approximation over the range investigated.
 

Attachments

  • MC integral.PNG
  • #38
chisigma said:
Posted on 05 21 2012 on www.artofproblemsolving.com by the member pablo_ro and not yet properly solved...

Computing integral using the random variables...

$\displaystyle \int_{0}^{1} ... \int_{0}^{1} \sqrt{x_{1}^{2}+ ...+ x_{n}^{2}}\ d x_{1}...d x_{n}$


Of course that is not a question in the area of probability, anyway that is a suggestive question the solution of which is not trivial...



Several years ago [see http://www.osti.gov/bridge/servlets/purl/919496-nVsPUt/919496.pdf...] D.H. Bailey, J.M. Borwein and R.E. Crandall examined the family of complex variable functions...

$\displaystyle B_{n}(s)= \int_{0}^{1} ... \int_{0}^{1} (x_{1}^{2}+...+x_{n}^{2})^{\frac{s}{2}}\ d x_{1} ... d x_{n}$ (1)

... and they arrived at the one-dimensional integral formula...

$\displaystyle B_{n}(s)= \frac{s}{\Gamma(1-\frac{s}{2})}\ \int_{0}^{\infty} \frac{1-b(u)^{n}}{u^{1+s}}\ du$ (2)

... where...

$\displaystyle b(u)= \int_{0}^{1} e^{- u^{2}\ x^{2}}\ dx = \frac{\sqrt{\pi}}{2}\ \frac{\text{erf}\ (u)}{u}$ (3)

... and, as a collateral result, at the asymptotic relation...

$\displaystyle B_{n}(s) \sim (\frac{n}{3})^{\frac{s}{2}}\ \{1 + \frac{s\ (s-2)}{10\ n} + ...\}$ (4)

Kind regards

$\chi$ $\sigma$
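Formula (2) can be checked numerically for s=1; the following sketch (the quadrature scheme, truncation point U and step count are my own choices) recovers the exact values of $B_{1}(1)$ and $B_{2}(1)$:

```python
import math

def b(u):
    """b(u) = ∫_0^1 exp(-u^2 x^2) dx = (sqrt(pi)/2) erf(u)/u, with b(0) = 1."""
    if u == 0.0:
        return 1.0
    return 0.5 * math.sqrt(math.pi) * math.erf(u) / u

def B(n, s=1.0, U=60.0, steps=120_000):
    """B_n(s) via formula (2): midpoint rule on (0, U] plus the analytic
    tail ∫_U^∞ u^{-1-s} du = 1/(s U^s), where b(u)^n is already negligible."""
    h = U / steps
    total = sum((1.0 - b((i + 0.5) * h) ** n) / ((i + 0.5) * h) ** (1.0 + s)
                for i in range(steps))
    total = total * h + 1.0 / (s * U ** s)
    return s / math.gamma(1.0 - s / 2.0) * total

print(B(1))   # exact: 1/2
print(B(2))   # exact: (sqrt(2) + asinh(1))/3 ≈ 0.7652
```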

 
  • #39
chisigma said:
Several years ago [see http://www.osti.gov/bridge/servlets/purl/919496-nVsPUt/919496.pdf...] D.H. Bailey, J.M. Borwein and R.E. Crandall examined the family of complex variable functions...

$\displaystyle B_{n}(s)= \int_{0}^{1} ... \int_{0}^{1} (x_{1}^{2}+...+x_{n}^{2})^{\frac{s}{2}}\ d x_{1} ... d x_{n}$ (1)

... and they arrived at the one-dimensional integral formula...

$\displaystyle B_{n}(s)= \frac{s}{\Gamma(1-\frac{s}{2})}\ \int_{0}^{\infty} \frac{1-b(u)^{n}}{u^{1+s}}\ du$ (2)

... where...

$\displaystyle b(u)= \int_{0}^{1} e^{- u^{2}\ x^{2}}\ dx = \frac{\sqrt{\pi}}{2}\ \frac{\text{erf}\ (u)}{u}$ (3)

... and, as a collateral result, at the asymptotic relation...

$\displaystyle B_{n}(s) \sim (\frac{n}{3})^{\frac{s}{2}}\ \{1 + \frac{s\ (s-2)}{10\ n} + ...\}$ (4)

Kind regards

$\chi$ $\sigma$



Interesting, I will have to have another run of the MC, since for \(s=1\) that gives:

\(I_n \sim \sqrt{\frac{n}{3}}-\frac{1}{10\sqrt{3n}}\)

which is significantly different from what I recall, and would have been undetectable.

CB
 
  • #40
CaptainBlack said:
Interesting, I will have to have another run of the MC, since for \(s=1\) that gives:

\(I_n \sim \sqrt{\frac{n}{3}}-\frac{1}{10\sqrt{3n}}\)

which is significantly different from what I recall, and would have been undetectable.

CB

Well, increasing the sample size to 1,000,000 seems to indicate that the above asymptotic form is better yet (over the range checked).

CB
 
  • #41
Posted on 06 06 2012 on www.mathhelpforum.com by tttcomrader and not yet solved...

Let X and Y be non-negative random variables with p.d.f.s f(*) and g(*), and let F(*) and G(*) be their Laplace Transforms. Show that...

$\displaystyle E\ \{e^{- \lambda\ X\ Y} \} = \int_{0}^{\infty} F(\lambda\ y)\ g(y)\ dy = \int_{0}^{\infty} G(\lambda\ y)\ f(y)\ dy$

Kind regards

$\chi$ $\sigma$
 
  • #42
chisigma said:
Posted on 06 06 2012 on www.mathhelpforum.com by tttcomrader and not yet solved...

Let X and Y be non-negative random variables with p.d.f.s f(*) and g(*), and let F(*) and G(*) be their Laplace Transforms. Show that...

$\displaystyle E\ \{e^{- \lambda\ X\ Y} \} = \int_{0}^{\infty} F(\lambda\ y)\ g(y)\ dy = \int_{0}^{\infty} G(\lambda\ y)\ f(y)\ dy$

Kind regards

$\chi$ $\sigma$

Is there anything to do? Writing out the definition of the expectation and doing one of the integrals should suffice.

CB
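Indeed; as a numerical illustration (my own example, with X, Y ~ Exp(1), whose Laplace Transform is $F(s)=\frac{1}{1+s}$), both sides of the identity agree to Monte Carlo accuracy:

```python
import math
import random

# Left side: E[exp(-lam X Y)] estimated directly.
# Right side: E_Y[F(lam Y)] with F(s) = 1/(1+s), the LT of Exp(1).
random.seed(3)
lam, trials = 1.0, 400_000
lhs = sum(math.exp(-lam * random.expovariate(1.0) * random.expovariate(1.0))
          for _ in range(trials)) / trials
rhs = sum(1.0 / (1.0 + lam * random.expovariate(1.0))
          for _ in range(trials)) / trials
print(lhs, rhs)   # both ≈ 0.596
```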
 
  • #43
Posted on 06 03 2012 on www.talkstat.com by the member jumpydad and not yet properly solved…

Here is an exercise that i don't understand how to solve: imagine a balcony that attends clients. For every 10 min. the number of people getting attended follows a Poisson distribution with an expected value of 2. The balcony only works from 9 am till 12 am, and only attends 40 people at max. Now the question is: what is the probability of not attending all clients in one morning (9am till 12am)?...

Kind regards

$\chi$ $\sigma$
 
  • #44
chisigma said:
Posted on 06 03 2012 on www.talkstat.com by the member jumpydad and not yet properly solved…

Here is an exercise that i don't understand how to solve: imagine a balcony that attends clients. For every 10 min. the number of people getting attended follows a Poisson distribution with an expected value of 2. The balcony only works from 9 am till 12 am, and only attends 40 people at max. Now the question is: what is the probability of not attending all clients in one morning (9am till 12am)?...

Kind regards

$\chi$ $\sigma$

The number attended in 3 hrs has a Poisson distribution with a mean of 36.

CB
 
  • #45
chisigma said:
Posted on 06 03 2012 on www.talkstat.com by the member jumpydad and not yet properly solved…

Here is an exercise that i don't understand how to solve: imagine a balcony that attends clients. For every 10 min. the number of people getting attended follows a Poisson distribution with an expected value of 2. The balcony only works from 9 am till 12 am, and only attends 40 people at max. Now the question is: what is the probability of not attending all clients in one morning (9am till 12am)?...


Very well!... in a time T=180 min the expected number of people getting attended is $\displaystyle \lambda=36$ and the probability of having exactly n people is...

$\displaystyle P_{n}= e^{- \lambda}\ \frac{\lambda^{n}}{n!}$ (1)

The probability of not attending all clients is...

$\displaystyle P_{ow}= 1- e^{- 36}\ \sum_{n=0}^{40} \frac{36^{n}}{n!}$ (2)

At this point the problem is the computation of the sum in (2), which can be performed, for example, using wolframalpha...

http://www.wolframalpha.com/input/?i=sum+e^(-+36)+36^j/+j!,+j=0...+40

... so that $P_{ow}= .222897574398...$. If wolframalpha isn't available, then the sum in (2) can be efficiently computed as explained in...

http://www.mathhelpboards.com/threads/426-Difference-equation-tutorial-draft-of-part-I

... as the 40th term of the sequence defined by the difference equation...

$\displaystyle a_{n+1}=\frac{n+1}{36}\ a_{n}+1\ ,\ a_{0}=1$ (3)

Kind regards

$\chi$ $\sigma$
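The recurrence (3) runs in a few lines; a sketch (the rescaling factor $e^{-36}\ 36^{40}/40!$ is evaluated in log space to avoid huge intermediate terms):

```python
import math

# With a_N as in (3), sum_{n=0}^{N} 36^n/n! equals a_N * 36^N/N!, so
# P_ow = 1 - a_N * exp(-36 + N*ln(36) - ln(N!)).
lam, N = 36.0, 40
a = 1.0
for n in range(N):
    a = (n + 1) / lam * a + 1.0
p_ow = 1.0 - a * math.exp(-lam + N * math.log(lam) - math.lgamma(N + 1))
print(p_ow)   # ≈ 0.2229, the value quoted above
```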
 
  • #46
chisigma said:
Very well!... in a time T=180 min. the expected number of people getting attended is $\displaystyle \lambda=32$ and the probability to have exactly n people is...

$\displaystyle P_{n}= e^{- \lambda}\ \frac{\lambda^{n}}{n!}$ (1)

The probability of not attending all clients is...

$\displaystyle P_{ow}= 1- e^{- 32}\ \sum_{n=0}^{40} \frac{32^{n}}{n!}$ (2)

At this point the problem is the computation of the sum in (2), which can be performed, for example, using wolframalpha...

http://www.wolframalpha.com/input/?i=sum+e%5E%28-+32%29+32%5Ej%2F+j%21%2C+j%3D0...+40

... so that $P_{ow}= .070660852878...$. If wolframalpha isn't available, then the sum in (2) can be efficiently computed as explained in...

http://www.mathhelpboards.com/threads/426-Difference-equation-tutorial-draft-of-part-I

... as the 40th term of the sequence defined by the difference equation...

$\displaystyle a_{n+1}=\frac{n+1}{32}\ a_{n}+1\ ,\ a_{0}=1$ (3)

Kind regards

$\chi$ $\sigma$

\( 18 \times 2=36 \)

CB
 
  • #47
Posted on 06 05 2012 on www.talkstat.com by the member Youler and not yet solved…

When cycling home at night, I notice that sometimes my rear light is switched off when I arrive home. Presumably the switch is loose and can flip from on to off or back again when I go over bumps. I suppose that the number n of flippings per trip has a Poisson distribution...

$\displaystyle P \{ k=n\} = e^{-\lambda}\ \frac{\lambda^{n}}{n!}$

If the probability that the light is still on when I arrive home is p, find $\lambda$...

Kind regards

$\chi$ $\sigma$
 
  • #48
chisigma said:
Posted on 06 05 2012 on www.talkstat.com by the member Youler and not yet solved…

When cycling home at night, I notice that sometimes my rear light is switched off when I arrive home. Presumably the switch is loose and can flip from on to off or back again when I go over bumps. I suppose that the number n of flippings per trip has a Poisson distribution...

$\displaystyle P \{ k=n\} = e^{-\lambda}\ \frac{\lambda^{n}}{n!}$

If the probability that the light is still on when I arrive home is p, find $\lambda$...


If we indicate with $P_{on}$ and $P_{off}$ the probabilities that the light is on or off, then...

$\displaystyle P_{on}= e^{- \lambda}\ \sum_{n\ even} \frac{\lambda^{n}}{n!}= e^{- \lambda}\ \cosh \lambda= \frac{1+ e^{-2\ \lambda}}{2}$ (1)

$\displaystyle P_{off}= e^{- \lambda}\ \sum_{n\ odd} \frac{\lambda^{n}}{n!}= e^{- \lambda}\ \sinh \lambda= \frac{1- e^{-2\ \lambda}}{2}$ (2)

The problem seems to be solved... but in fact it is required to find $\lambda$ as a function of $P_{on}$, not $P_{on}$ as a function of $\lambda$, so the inversion of (1) or (2) is necessary; that will be done in a subsequent post...

Kind regards

$\chi$ $\sigma$
 
  • #49
chisigma said:
If we indicate with $P_{on}$ and $P_{off}$ the probabilities that the light is on or off, then...

$\displaystyle P_{on}= e^{- \lambda}\ \sum_{n\ even} \frac{\lambda^{n}}{n!}= e^{- \lambda}\ \cosh \lambda= \frac{1+ e^{-2\ \lambda}}{2}$ (1)

$\displaystyle P_{off}= e^{- \lambda}\ \sum_{n\ odd} \frac{\lambda^{n}}{n!}= e^{- \lambda}\ \sinh \lambda= \frac{1- e^{-2\ \lambda}}{2}$ (2)

The problem seems to be solved... but in fact it is required to find $\lambda$ as a function of $P_{on}$, not $P_{on}$ as a function of $\lambda$, so the inversion of (1) or (2) is necessary; that will be done in a subsequent post...

From the practical point of view it is easier to find $\lambda$ as a function of $q=P_{off}$ and afterwards apply, if necessary, the substitution $p=1-q$. The procedure is relatively easy...

$\displaystyle q=\frac{1-e^{-2 \lambda}}{2} \implies 1-2\ q= e^{-2\ \lambda} \implies \lambda= - \ln \sqrt{1-2\ q}= - \ln \sqrt {2\ p-1}$ (1)

... where $0< q < \frac{1}{2}$ and $\frac{1}{2}< p < 1$...

Kind regards

$\chi$ $\sigma$
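The inversion (1) can be verified with a small round-trip simulation (a sketch of mine; the target value p=0.8 and the use of Knuth's Poisson sampler are arbitrary choices):

```python
import math
import random

def lam_from_p(p):
    """Invert P_on = (1 + e^{-2 lambda})/2  =>  lambda = -ln sqrt(2p - 1)."""
    return -math.log(math.sqrt(2.0 * p - 1.0))

def poisson_sample(lam):
    """Knuth's Poisson sampler (adequate for small lambda)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# round trip: pick p, recover lambda, simulate flips, re-estimate p;
# the light is on at arrival iff the number of flips is even
random.seed(4)
lam = lam_from_p(0.8)
trials = 200_000
on = sum(1 for _ in range(trials) if poisson_sample(lam) % 2 == 0)
print(lam, on / trials)   # lambda ≈ 0.2554, estimate ≈ 0.8
```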
 
  • #50
Posted on 06 15 2012 on www.mathhelpforum.com by the member saravananbs and not yet solved…

… if x and y are independent random variables such that $f(x)= e^{-x},\ x \ge 0$ and $g(y)= 3\ e^{-3\ y},\ y \ge 0$, find the probability distribution function of z=x/y; how can it be taken forward…

Kind regards

$\chi$ $\sigma$
 
  • #51
chisigma said:
Posted on 06 15 2012 on www.mathhelpforum.com by the member saravananbs and not yet solved…

… if x and y are independent random variables such that $f(x)= e^{-x},\ x \ge 0$ and $g(y)= 3\ e^{-3\ y},\ y \ge 0$, find the probability distribution function of z=x/y; how can it be taken forward…

In post #31 of this thread we found that, if X and Y are r.v. with p.d.f. $\displaystyle f_{x}(*)$ and $\displaystyle f_{y} (*)$, then the r.v. $U=\frac{X}{Y}$ has p.d.f. ...

$\displaystyle f_{u}(u)= \int_{- \infty}^{+ \infty} |v|\ f_{x,y} (u v,v)\ dv$ (1)

Now for $f_{x}(x)=e^{-x},\ x>0$ and $f_{y}(y)=3\ e^{-3\ y},\ y>0$ we have...

$\displaystyle f_{u}(u)= \int_{0}^{\infty} v\ e^{-(u+3)\ v}\ dv = \frac{3}{(u+3)^{2}},\ u>0$ (2)

Kind regards

$\chi$ $\sigma$
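As a sanity check of (2), a sketch of my own: the c.d.f. implied by (2) is $F_{u}(u)= \frac{u}{u+3}$, which matches a direct simulation of X/Y:

```python
import random

# Monte Carlo check of (2): with f_u(u) = 3/(u+3)^2 the c.d.f. is
# F_u(u) = u/(u+3); compare with simulated X/Y, X ~ Exp(1), Y ~ Exp(3).
random.seed(5)
trials, u0 = 200_000, 2.0
hits = sum(1 for _ in range(trials)
           if random.expovariate(1.0) / random.expovariate(3.0) <= u0)
empirical = hits / trials
analytic = u0 / (u0 + 3.0)
print(empirical, analytic)   # empirical ≈ analytic = 0.4
```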
 
  • #52
Posted on 06 15 2012 on www.mathhelpforum.com by the member saravananbs and not yet solved…

… if x and y are independent random variables such that

$ \displaystyle f(x)= \frac{1}{\pi\ (1+x^{2})},\ |x|<1$

... and...

$\displaystyle g(y)= y\ e^{- \frac{y^{2}}{2}},\ y>0$

...find the joint density function of z and w where z=xy and w=x…

Kind regards

$\chi$ $\sigma$
 
  • #53
chisigma said:
Posted on 06 15 2012 on www.mathhelpforum.com by the member saravananbs and not yet solved…

… if x and y are independent random variables such that

$ \displaystyle f(x)= \frac{1}{\pi\ (1+x^{2})},\ |x|<1$

... and...

$\displaystyle g(y)= y\ e^{- \frac{y^{2}}{2}},\ y>0$

...find the joint density function of z and w where z=xy and w=x…


In post #31 of this thread we found that, if X and Y are r.v. with p.d.f. $\displaystyle f_{x}(*)$ and $\displaystyle f_{y}(*)$, then the r.v. $U= X\ Y$ has p.d.f. ...

$\displaystyle f_{u}(u)= \int_{- \infty}^{+ \infty} \frac{1}{|v|}\ f_{x,y} (\frac{u}{v}, v)\ dv$ (1)

Now for $\displaystyle f_{x}(x)= \frac{1}{\pi\ (1+x^{2})},\ |x|<1$ and $\displaystyle f_{y}(y)= y\ e^{- \frac{y^{2}}{2}},\ y>0$ we have...

$\displaystyle f_{u} (u)= \frac{1}{\pi}\ \int_{u}^{\infty} \frac{e^{- \frac{v^{2}}{2}}}{1+\frac{u^{2}}{v^{2}}}\ dv$ (2)

The integral in (2), however, is not very comfortable and some more study is necessary...

Kind regards

$\chi$ $\sigma$
 
  • #54
Posted on 06 15 2012 on www.matematicamente.it by the member Edwavit [original in Italian…] and not yet solved…

Hello boys!... a question I can’t solve: X is a Gaussian r.v. with $\mu = 10$ and $\sigma=4$ and $P(X>C)=.05$. Find C…

Kind regards

$\chi$ $\sigma$
 
  • #55
chisigma said:
Posted on 06 15 2012 on www.matematicamente.it by the member Edwavit [original in Italian…] and not yet solved…

Hello boys!... a question I can’t solve: X is a Gaussian r.v. with $\mu = 10$ and $\sigma=4$ and $P(X>C)=.05$. Find C…

Setting $\frac{C-\mu}{\sigma}=x$, we have...

$\displaystyle P\{X<C\}= \frac{1}{2}\ \{1+ \text{erf}\ (\frac{x}{\sqrt{2}})\} = .95$ (1)

... where...

$\displaystyle \text{erf}\ (t)= \frac{2}{\sqrt{\pi}}\ \int_{0}^{t} e^{- \xi^{2}}\ d \xi$ (2)

... so that from (1) we derive...

$\displaystyle x= \sqrt{2}\ \text{erf}\ ^{-1} (.9)$ (3)

Now of course the problem is the computation of the function $\text{erf}\ ^{-1} (*)$. In...

http://www.mathhelpboards.com/threads/1223-erf

… it has been explained how to find the coefficients of the McLaurin expansion of the function $\text{erf}\ ^{-1} (*)$, a task defined as ‘tedious but not very difficult’. In …

http://mathworld.wolfram.com/InverseErf.html


… we discover that the ‘tedious task’ has been done by somebody some years ago and the result is the series expansion…

$\displaystyle \text{erf}\ ^{-1} (\frac{2\ x}{\sqrt{\pi}}) = \sum_{n=0}^{\infty} a_{n}\ x^{2n+1}$ (4)

... where...

$\displaystyle a_{n}=\frac{c_{n}}{2n+1}$ (5)

... with $c_{n}$ solution of the difference equation...

$\displaystyle c_{n}= \sum_{k=0}^{n-1} \frac{c_{k}\ c_{n-k-1}}{(k+1)\ (2\ k+1)},\ c_{0}=1$ (6)

The first $a_{n}$ are $a_{0}=1$, $a_{1}= \frac{1}{3}$, $a_{2}= \frac {7}{30}$, $a_{3}= \frac{127}{630}$,... The remaining computation is relatively 'comfortable' and is left to the reader...

Kind regards

$\chi$ $\sigma$
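The 'comfortable' remaining computation can be sketched in a few lines (my own; the $c_{n}$ are generated by the recurrence with denominator $(k+1)(2k+1)$, and the series converges fast enough at the required argument):

```python
import math

def erfinv_series(z, terms=200):
    """erf^{-1}(z) from the series (4)-(5), with the c_n generated by
    c_n = sum_{k=0}^{n-1} c_k c_{n-1-k} / ((k+1)(2k+1)),  c_0 = 1."""
    c = [1.0]
    for n in range(1, terms):
        c.append(sum(c[k] * c[n - 1 - k] / ((k + 1) * (2 * k + 1))
                     for k in range(n)))
    x = 0.5 * math.sqrt(math.pi) * z      # so that z = 2 x / sqrt(pi)
    return sum(c[n] / (2 * n + 1) * x ** (2 * n + 1) for n in range(terms))

x = math.sqrt(2.0) * erfinv_series(0.9)   # the 95th percentile of N(0,1)
C = 10.0 + 4.0 * x                        # C = mu + sigma * x
print(C)   # ≈ 16.58
```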
 
  • #56
Posted on 06 16 2012 on www.talkstat.com by the member Ramirez and not yet solved…

A light bulb manufacturer advertises that 'the average life of our new light bulb is 50,000 seconds. An immediate adjustment will be made on any bulb that does not last 50,000 seconds'. You purchased four of these bulbs. What is the probability that all four bulbs will last more than 50,000 seconds?...

Kind regards

$\chi$ $\sigma$
 
  • #57
chisigma said:
Posted on 06 16 2012 on www.talkstat.com by the member Ramirez and not yet solved…

A light bulb manufacturer advertises that 'the average life of our new light bulb is 50,000 seconds. An immediate adjustment will be made on any bulb that does not last 50,000 seconds'. You purchased four of these bulbs. What is the probability that all four bulbs will last more than 50,000 seconds?...

Assuming an exponentially distributed life time, the life T of a manufactured article with mean life time $\tau$ has p.d.f. ...

$\displaystyle f(t)= \frac{1}{\tau}\ e^{- \frac{t}{\tau}},\ t>0$ (1)

... so that the probability that the life time is greater than $\tau$ is...

$\displaystyle P\{T>\tau\} = \frac{1}{\tau}\ \int_{\tau}^{\infty} e^{-\frac{t}{\tau}}\ dt = e^{-1}$ (2)

For 4 manufactured articles the probability that all life times are greater than $\tau$ is $P=e^{-4}$...

Kind regards

$\chi$ $\sigma$
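A simulation sketch of the result (illustrative only; note that the exponential model for the life time is the assumption made above):

```python
import math
import random

# Four independent exponential lifetimes with mean tau = 50,000 s;
# all four exceed tau with probability e^{-4}.
random.seed(6)
tau, trials = 50_000.0, 100_000
ok = sum(1 for _ in range(trials)
         if all(random.expovariate(1.0 / tau) > tau for _ in range(4)))
print(ok / trials, math.exp(-4.0))   # both ≈ 0.018
```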
 
  • #58
Posted on 06 21 2012 on www.matematicamente.it by the member sairaki87 [original in Italian language...] and not yet solved...

There are two fellows A and B and an urn with 50 white balls and 1 black ball. A and B alternately extract a ball, and the winner is the one who extracts the black ball. A extracts first. What is the probability for A and for B to be the winner?...

Kind regards

$\chi$ $\sigma$
 
  • #59
chisigma said:
Posted on 06 21 2012 on www.matematicamente.it by the member sairaki87 [original in Italian language...] and not yet solved...

There are two fellows A and B and an urn with 50 white balls and 1 black ball. A and B alternately extract a ball, and the winner is the one who extracts the black ball. A extracts first. What is the probability for A and for B to be the winner?...

Of course it is sufficient to compute the probability $P_{A}$ that A is the winner; the probability that B is the winner is then $P_{B}=1-P_{A}$. If n is the overall number of balls [n-1 white and 1 black...], the probability that the black ball is extracted at the k-th extraction is...

$\displaystyle P_{k}=\frac{n-1}{n}\ \frac{n-2}{n-1}\ ... \frac{n-k+1}{n-k+2}\ \frac{1}{n-k+1}=\frac{1}{n}$ (1)

Now we have two possibilities...

a) n is even, so that $\displaystyle P_{A}= \sum_{k\ \text{odd}} \frac{1}{n}= \frac{1}{2}$

b) n is odd, so that $\displaystyle P_{A}= \sum_{k\ \text{odd}} \frac{1}{n}= \frac{n+1}{2\ n}$

In our case n=51, so that $\displaystyle P_{A}= \frac{26}{51}$ and $P_{B}= \frac{25}{51}$...

Kind regards

$\chi$ $\sigma$
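Since the black ball's position in the drawing order is uniform over the n positions, the result is easy to confirm by simulation (a sketch; here n=51, for which $\frac{n+1}{2n}= \frac{26}{51}$):

```python
import random

# The black ball's position is uniform on {1, ..., 51}; A draws the
# odd-numbered positions 1, 3, 5, ..., so A wins iff the position is odd.
random.seed(7)
n, trials = 51, 100_000
a_wins = sum(1 for _ in range(trials)
             if random.randrange(1, n + 1) % 2 == 1)
print(a_wins / trials)   # (n+1)/(2n) = 26/51 ≈ 0.5098
```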
 
  • #60
Posted on 04 23 2012 on http://www.scienzematematiche.it/ by the user whitefang [original in Italian…] and not yet properly solved…

We are shooting at a target on a two-dimensional plane. The horizontal and vertical distances of the hits with respect to the target are normal r.v. with $\mu=0$ and $\sigma=4$. D is the distance between the hit and the target. Find $E\{D\}$...

Kind regards

$\chi$ $\sigma$
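The question is still open at this point; as a numerical sketch (assuming, as seems intended, that the two coordinates are independent), a Monte Carlo run agrees with the Rayleigh mean $\sigma\ \sqrt{\frac{\pi}{2}} \approx 5.013$:

```python
import math
import random

# D = sqrt(X^2 + Y^2) with X, Y independent N(0, sigma^2) is Rayleigh
# distributed; its mean is sigma * sqrt(pi/2).
random.seed(8)
sigma, trials = 4.0, 200_000
mean_d = sum(math.hypot(random.gauss(0.0, sigma), random.gauss(0.0, sigma))
             for _ in range(trials)) / trials
print(mean_d, sigma * math.sqrt(math.pi / 2.0))   # both ≈ 5.01
```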
 
