MHB Unsolved statistics questions from other sites....

  • Thread starter: chisigma
  • Start date
  • Tags: Statistics
chisigma
Gold Member
MHB
I am opening this thread with the purpose of answering statistics questions posted on other sites that have gone unanswered for at least three days. Let's start with this question posted on mathhelpforum.com by Frida on 04 22 2012...

A variable of two populations has a mean of 7.9 and a standard deviation of 5.4 for one of the populations, and a mean of 7.1 and a standard deviation of 4.6 for the other population. Can you conclude that the variable x1-x2 is normally distributed, and why?...

How should we answer Frida?...

Kind regards

$\chi$ $\sigma$
 
chisigma said:
I am opening this thread with the purpose of answering statistics questions posted on other sites that have gone unanswered for at least three days. Let's start with this question posted on mathhelpforum.com by Frida on 04 22 2012...

A variable of two populations has a mean of 7.9 and a standard deviation of 5.4 for one of the populations, and a mean of 7.1 and a standard deviation of 4.6 for the other population. Can you conclude that the variable x1-x2 is normally distributed, and why?...

How should we answer Frida?...

Kind regards

$\chi$ $\sigma$

No, not as it stands. Even now I find myself interpolating information that is not included in the question, such as that x1 and x2 are drawn from the first and second populations respectively.

You can see that this is the case if in population 1 we have the value 7.9-5.4 with probability 0.5 and the value 7.9+5.4 with probability 0.5, and in population 2 we have 7.1-4.6 with probability 0.5 and 7.1+4.6 with probability 0.5. Then x1-x2 is discrete and so not normal (and the same sort of trick, but a bit more subtle, can be played if we insist on continuous distributions).

More likely, the OP has omitted a stipulation that the two populations be normal, in which case the answer is yes.

CB
 
If one gives the mean and standard deviation of a random variable and nothing else, one probably implies that the random variable is normally distributed. If that is so, then $X_{1}$ and $X_{2}$ have p.d.f.s ...

$\displaystyle f_{1}(x)= \frac{1}{\sqrt{2\ \pi}\ \sigma_{1}}\ e^{- \frac{(x - \mu_{1})^{2}}{2\ \sigma_{1}^{2}}}$

$\displaystyle f_{2}(x)= \frac{1}{\sqrt{2\ \pi}\ \sigma_{2}}\ e^{- \frac{(x - \mu_{2})^{2}}{2\ \sigma_{2}^{2}}}$ (1)

If we define a new random variable $X=X_{1}+X_{2}$, then X has p.d.f. $f(x)=f_{1}(x)\ *\ f_{2}(x)$, where '*' denotes convolution. Applying a basic property of the Fourier Transform, if we set...

$\displaystyle F_{1}(\omega)= \mathcal {F} \{f_{1}(x)\} = e^{-i\ 2\ \pi\ \mu_{1}\ \omega}\ e^{-2\ \pi^{2}\ \sigma_{1}^{2}\ \omega^{2}}$

$\displaystyle F_{2}(\omega)= \mathcal {F} \{f_{2}(x)\} = e^{-i\ 2\ \pi\ \mu_{2}\ \omega}\ e^{-2\ \pi^{2}\ \sigma_{2}^{2}\ \omega^{2}}$ (2)

... then...

$\displaystyle \mathcal {F} \{f(x)\} = F_{1}(\omega)\ F_{2}(\omega) = e^{-i\ 2\ \pi\ (\mu_{1}+ \mu_{2} )\ \omega}\ e^{-2\ \pi^{2}\ (\sigma_{1}^{2}+\sigma_{2}^{2})\ \omega^{2}}$ (3)

From (3) it is clear that X is a normal random variable with mean $\mu=\mu_{1}+\mu_{2}$ and standard deviation $\sigma= \sqrt{\sigma_{1}^{2}+\sigma_{2}^{2}}$. In Frida's example, taking the second variable with its sign reversed, we have $\mu_{1}=7.9$, $\mu_{2}=-7.1$, $\sigma_{1}=5.4$, $\sigma_{2}=4.6$, so that $\mu=.8$ and $\sigma \approx 7.09$. The extension to the case of more than two r.v.s is simple and is left to the reader...
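For readers who want a quick numerical check of this conclusion, here is a minimal Monte-Carlo sketch (in Python with NumPy, which is an assumption; it is not the tool used in this thread). It assumes, as above, that both populations are normal and compares the sample mean and standard deviation of $x_{1}-x_{2}$ with $0.8$ and $\sqrt{5.4^{2}+4.6^{2}} \approx 7.09$.

Code:
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000

# Frida's example, assuming both populations are normal
x1 = rng.normal(loc=7.9, scale=5.4, size=N)
x2 = rng.normal(loc=7.1, scale=4.6, size=N)
d = x1 - x2

print(d.mean())                     # should be close to 0.8
print(d.std())                      # should be close to 7.09
print(np.sqrt(5.4**2 + 4.6**2))     # theoretical standard deviation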

Kind regards

$\chi$ $\sigma$
 
chisigma said:
If one gives the mean and standard deviation of a random variable and nothing else, one probably implies that the random variable is normally distributed.

That implication is not acceptable; it gets people into bad habits, like assuming you can word a question carelessly and still expect a correct answer.

We must tell posters to post the question as asked, or answer the question as asked and point out that this is probably not what they really wanted to ask. Especially in a case like this one, where the question asks if a particular conclusion is valid, the answer to the question as asked is no.

CB
 
Posted on 04 15 2012 on the Italian site www.matematicamente.it by the member lutteo2000 [original in Italian...] and not yet solved...

Three points A, B and C are chosen at random on a circle. What is the probability that the centre O of the circle lies inside the triangle ABC?...

Certainly lutteo2000 has formulated his question very well!... how should we answer?...

Kind regards

$\chi$ $\sigma$
 
chisigma said:
Posted on 04 15 2012 on the Italian site www.matematicamente.it by the member lutteo2000 [original in Italian...] and not yet solved...

Three points A, B and C are chosen at random on a circle. What is the probability that the centre O of the circle lies inside the triangle ABC?...

Certainly lutteo2000 has formulated his question very well!... how should we answer?...

Kind regards

$\chi$ $\sigma$

Well, I have a method, though it is difficult to explain without being able to sketch:

The position of the first point A can be taken to be (in polars) \(\theta=0\), and if the smaller of the angles between the first and second points is \(\phi\), the probability that the third point forms a triangle enclosing the centre of the circle is \(\phi / (2 \pi) \) (this last bit is where a sketch is needed). Then, as \( \phi \sim U(0, \pi) \), the required probability is:

\[P=\int_0^{\pi} \frac{\phi}{2 \pi} \frac{1}{ \pi}\; d\phi=1/4\]

This result is supported by a simple Monte-Carlo calculation without simplifying assumptions.
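An illustrative sketch of such a Monte-Carlo check (in Python with NumPy, which is an assumption; it is not the code referred to above): draw the three angles uniformly and use the fact that the centre is inside the triangle exactly when all three arcs between consecutive points are smaller than \(\pi\).

Code:
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000

# three independent angles, uniform on [0, 2*pi)
theta = rng.uniform(0.0, 2.0 * np.pi, size=(N, 3))
theta.sort(axis=1)

# the three arcs between consecutive points (the last one wraps around)
arc12 = theta[:, 1] - theta[:, 0]
arc23 = theta[:, 2] - theta[:, 1]
arc31 = 2.0 * np.pi - theta[:, 2] + theta[:, 0]

# the centre is inside the triangle iff every arc is smaller than pi
inside = (arc12 < np.pi) & (arc23 < np.pi) & (arc31 < np.pi)
print(inside.mean())   # should be close to 1/4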

CB
 
Last edited:
chisigma said:
Posted on 04 15 2012 on the Italian site www.matematicamente.it by the member lutteo2000 [original in Italian...] and not yet solved...

Three points A, B and C are chosen at random on a circle. What is the probability that the centre O of the circle lies inside the triangle ABC?...

Certainly lutteo2000 has formulated his question very well!... how should we answer?...

The solution is surprisingly simple if one uses some tricks...

a) first, we suppose that the circle is the unit circle and each point is defined by an angle...

b) second, it is no restriction to suppose that the first point is the point (1,0)...

c) third, we normalize the angles to $\pi$, so that the second and third points are random variables uniformly distributed between 0 and 2...

Under these hypotheses, indicating with X and Y the normalized angles of the second and third points, the centre lies inside the triangle exactly when all three arcs are less than $\pi$, i.e. (taking X<Y) when $X<1<Y<1+X$. Using the symmetry between X and Y, the requested probability is...

$\displaystyle P=\frac{2}{4}\ \int_{0}^{1} dx\ \int_{1}^{1+x} dy= \frac{1}{4}$

Kind regards

$\chi$ $\sigma$
 
Posted on 03 20 2012 on http://www.scienzematematiche.it by the member fry [original in Italian language...] and not yet properly solved...

Let $X_{n},\ n=1,2,...$ be a sequence of [binary] independent random variables with $\displaystyle P\{X_{n}=0\}=P\{X_{n}=1\}=\frac{1}{2}$, and let's define the R.V. $\displaystyle Y=\sum_{n=1}^{\infty} \frac{X_{n}}{2^{n}}$. How do we demonstrate that Y is uniformly distributed between 0 and 1?...

For completeness' sake I would suggest also analysing the more general case $\displaystyle P\{X_{n}=0\}=p,\ 0<p<1$...

Kind regards

$\chi$ $\sigma$
 
chisigma said:
Posted on 03 20 2012 on http://www.scienzematematiche.it/ by the member fry [original in Italian language...] and not yet properly solved...

Let $X_{n},\ n=1,2,...$ be a sequence of [binary] independent random variables with $\displaystyle P\{X_{n}=0\}=P\{X_{n}=1\}=\frac{1}{2}$, and let's define the R.V. $\displaystyle Y=\sum_{n=1}^{\infty} \frac{X_{n}}{2^{n}}$. How do we demonstrate that Y is uniformly distributed between 0 and 1?...

The demonstration I will give is based on the Fourier Transform and on the so-called 'Vieta's product'. In the sixteenth century the French mathematician François Viète was able to evaluate [one of the first such results in Western mathematics...] a non-trivial infinite product. He started by iteratively applying n times the trigonometric relationship...

$\displaystyle \sin 2\alpha=2\ \sin \alpha\ \cos \alpha$ (1)

… and obtaining…

$\displaystyle \sin\ 2^{n}\ \alpha = 2^{n}\ \sin \alpha\ \cos \alpha\ \cos 2 \alpha\ \cdots\ \cos 2^{n-1} \alpha$ (2)

Setting in (2) $\displaystyle \theta= 2^{n} \alpha$ he obtained first…

$\displaystyle \sin \theta= 2^{n}\ \sin \frac{\theta}{2^{n}}\ \cos \frac{\theta}{2^{n}}\ \cos \frac{\theta}{2^{n-1}}\ …\ \cos \frac{\theta}{2}$ (3)

… and from (3)…

$\displaystyle\frac{\sin \theta}{\theta}=\frac{\sin \frac{ \theta}{2^{n}}}{\frac{\theta}{2^{n}}}\ \prod_{k=1}^{n} \cos \frac{\theta}{2^{k}}$ (4)

… and finally leaving n to tend to infinity he obtained…

$\displaystyle \frac{\sin \theta}{\theta}= \prod_{k=1}^{\infty} \cos \frac{\theta}{2^{k}}$ (5)

… a very remarkable result for the sixteenth century!...

Now we consider that each term $\displaystyle \frac{X_{n}}{2^{n}}$ has [discrete] p.d.f. $\displaystyle f_{n}(x)= \frac{\delta(x)+ \delta(x-\frac{1}{2^{n}})}{2}$, the Fourier Transform of which is…

$\displaystyle F_{n}(\omega)= \mathcal{F} \{f_{n}(x)\}= \frac{1}{2}\ (1+e^{- i\ \frac{\omega}{2^{n}}})= e^{-i\ \frac{\omega}{2^{n+1}}}\ \cos \frac{\omega}{2^{n+1}}$ (6)

From (6) it is possible to derive the F.T. of the p.d.f. of Y by applying the convolution theorem and (5)…

$\displaystyle F(\omega)= \prod_{n=1}^{\infty}F_{n}(\omega)= \prod_{n=1}^{\infty} (e^{-i\ \frac{\omega}{2^{n+1}}}\ \cos\frac{\omega}{2^{n+1}})= e^{-i\ \omega\ \sum_{n>0} \frac{1}{2^{n+1}}}\ \prod_{n=1}^{\infty} \cos \frac{\omega}{2^{n+1}}= e^{-i\ \frac{\omega}{2}}\ \frac{\sin \frac{\omega}{2}}{\frac{\omega}{2}} $ (7)

... and performing the inverse F.T. of (7) we finally obtain...

$\displaystyle f(x)=\mathcal{F}^{-1}\{F(\omega)\}= \mathcal{U}(x)-\mathcal{U}(x-1)$ (8)

... where $\mathcal{U}(*)$ is the unit step function, i.e. Y is uniformly distributed between 0 and 1.
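A small numerical illustration of this result (a Python/NumPy sketch, which is an assumption, with the series truncated at 40 bits, an arbitrary choice): the empirical quantiles of the simulated Y should match those of the uniform distribution on (0,1).

Code:
import numpy as np

rng = np.random.default_rng(2)
N, n_bits = 200_000, 40

# X_n are fair bits; Y = sum_n X_n / 2^n, truncated after n_bits terms
bits = rng.integers(0, 2, size=(N, n_bits))
weights = 0.5 ** np.arange(1, n_bits + 1)
y = bits @ weights

# crude uniformity check: each empirical quantile should be close to its level
for q in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(q, np.quantile(y, q))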

Kind regards

$\chi$ $\sigma$


 
  • #10
Posted on 04 19 2012 on www.matematicamente.it by the member caporock [original in Italian language...] and not yet properly solved...

In a post office there is a queue of five persons waiting in front of the window. The service time $T$ is a R.V. uniformly distributed between 0 and 1 minute. Let $T_{f}$ be the time required to serve the five persons. Compute the expected value and variance of $T_{f}$...

Kind regards

$\chi$ $\sigma$
 
  • #11
chisigma said:
Posted on 04 19 2012 on www.matematicamente.it by the member caporock [original in Italian language...] and not yet properly solved...

In a post office there is a queue of five persons waiting in front of the window. The service time $T$ is a R.V. uniformly distributed between 0 and 1 minute. Let $T_{f}$ be the time required to serve the five persons. Compute the expected value and variance of $T_{f}$...

Kind regards

$\chi$ $\sigma$

\(T_f\) is the sum of \(5\) \(U(0,1)\) RVs, so its mean is \(5 \times 0.5\) and its variance is \(5 \times \frac{1}{12}\).

That is, means and variances sum for independent RVs (note I have assumed the service times are independent).

And just to show off we can do a Monte-Carlo experiment to show this:

Code:
>N=1000000;
>tt=random(N,5);  ..generate N 5 person service times ~U(0,1)
>
>tf=(sum(tt))';   .. compute N total service times for 5 person queues 
>
>{m,s}=meandev(tf);[m,s^2], .. compute mean and variance
      2.50008      0.416111 
>
>[5*0.5,5/12],   .. reference values
          2.5      0.416667 
>

CB
 
Last edited:
  • #12
Are the service times for the individual persons i.i.d.? Is that what the uniform distribution refers to?

If yes, E(Tf) = 5·E(T) = 2.5 min
and var(Tf) = 5·var(T) = 5/12 min².

But I doubt that is the question. Sorry, I didn't see the previous post.
 
  • #13
Posted on 11 22 2011 on http://www.scienzematematiche.it [original in Italian...]...

What is the probability P that the distance between two random points inside an n-sphere of radius 1 is less than 1?...

I suggest starting with n=2, i.e. the 'sphere' is a circle...

Kind regards

$\chi$ $\sigma$
 
  • #14
chisigma said:
Posted on 11 22 2011 on http://www.scienzematematiche.it [original in Italian...]...

What is the probability P that the distance between two random points inside an n-sphere of radius 1 is less than 1?...

I suggest starting with n=2, i.e. the 'sphere' is a circle...



Clearly it is no restriction to suppose that the first point lies on the positive real axis at distance $x_{0}$ from the centre, and in this case, as illustrated in the figure...
View attachment 170

... the probability that the second random point $x_{1}$ satisfies $|x_{1}-x_{0}|<1$ is the ratio of the area of the two circular segments separated by the line $x=\frac{x_{0}}{2}$ (the lens common to the unit circle and the unit circle centred at $x_{0}$) to the area of the unit circle, i.e. ...

$\displaystyle P\{|x_{1}-x_{0}|<1\} = \frac{2}{\pi}\ (\cos ^{-1} \frac{x_{0}}{2} - \frac{x_{0}}{2}\ \sqrt{1-\frac{x_{0}^{2}}{4}})$ (1)

But $x_{0}$ is uniformly distributed from 0 to 1, so that the requested probability is...

$\displaystyle P= \frac{4}{\pi} \int_{0}^{\frac{1}{2}} (\cos^{-1} x -x\ \sqrt{1-x^{2}})\ dx = \frac{4}{\pi}\ |x\ \cos^{-1} x -\sqrt{1-x^{2}} + \frac{(1-x^{2})^{\frac{3}{2}}}{3}|_{0}^{\frac{1}{2}}= .688849968669...$ (2)

Kind regards

$\chi$ $\sigma$
 

Attachments

  • MHB15.PNG
  • #15
chisigma said:
Clearly it is no restriction to suppose that the first point lies on the positive real axis at distance $x_{0}$ from the centre, and in this case, as illustrated in the figure...
View attachment 170

... the probability that the second random point $x_{1}$ satisfies $|x_{1}-x_{0}|<1$ is the ratio of the area of the two circular segments separated by the line $x=\frac{x_{0}}{2}$ (the lens common to the unit circle and the unit circle centred at $x_{0}$) to the area of the unit circle, i.e. ...

$\displaystyle P\{|x_{1}-x_{0}|<1\} = \frac{2}{\pi}\ (\cos ^{-1} \frac{x_{0}}{2} - \frac{x_{0}}{2}\ \sqrt{1-\frac{x_{0}^{2}}{4}})$ (1)

But $x_{0}$ is uniformly distributed from 0 to 1, so that the requested probability is...

$\displaystyle P= \frac{4}{\pi} \int_{0}^{\frac{1}{2}} (\cos^{-1} x -x\ \sqrt{1-x^{2}})\ dx = \frac{4}{\pi}\ |x\ \cos^{-1} x -\sqrt{1-x^{2}} + \frac{(1-x^{2})^{\frac{3}{2}}}{3}|_{0}^{\frac{1}{2}}= .688849968669...$ (2)

Kind regards

$\chi$ $\sigma$

\( x_0 \) is not uniformly distributed from 0 to 1 since it is the radial component (in polars) of a random point in the unit circle.

\( \displaystyle p(x_0)= 2\ x_0\) for \(x_0 \in [0,1]\) and zero otherwise.

CB
 
Last edited:
  • #16
CaptainBlack said:
\( x_0 \) is not uniformly distributed from 0 to 1 since it is the radial component (in polars) of a random point in the unit circle.

\( \displaystyle p(x_0)= 2\ x_0\) for \(x_0 \in [0,1]\) and zero otherwise.

CB

Of course we have to agree on the definition of 'random point inside the unit circle'. The definition I adopted is a complex number of the form...

$\displaystyle x_{0}= \rho\ e^{i\ \theta}$ (1)

... where $\rho$ is a R.V. uniformly distributed from 0 to 1 and $\theta$ is a R.V. uniformly distributed from $-\pi$ to $+\pi$. As in all problems with circular symmetry, you can set $\theta=0$, so that $x_{0}$ is a real R.V. uniformly distributed from 0 to 1. If the definition is different, of course everything must be revised... but what is that 'different definition'?...

Kind regards

$\chi$ $\sigma$
 
  • #17
chisigma said:
Of course we have to agree on the definition of 'random point inside the unit circle'. The definition I adopted is a complex number of the form...

$\displaystyle x_{0}= \rho\ e^{i\ \theta}$ (1)

... where $\rho$ is a R.V. uniformly distributed from 0 to 1 and $\theta$ is a R.V. uniformly distributed from $-\pi$ to $+\pi$. As in all problems with circular symmetry, you can set $\theta=0$, so that $x_{0}$ is a real R.V. uniformly distributed from 0 to 1. If the definition is different, of course everything must be revised... but what is that 'different definition'?...

Kind regards

$\chi$ $\sigma$

The definition of a uniform distribution on a region \(R\) of \(\mathbb{R}^n\), such as an \(n\)-dimensional ball, is essentially that any sub-region has probability of occurring equal to the ratio of its (hyper)volume to that of \(R\).

It is not my definition; it is the standard definition (with concessions made in approximating the measure-theoretic form of the definition).

CB
 
  • #18
chisigma said:
Posted on 11 22 2011 on http://www.scienzematematiche.it [original in Italian...]...

What is the probability P that the distance between two random points inside an n-sphere of radius 1 is less than 1?...

I suggest starting with n=2, i.e. the 'sphere' is a circle...

Kind regards

$\chi$ $\sigma$


It is fairly easy to provide some estimates of the required probability using Monte-Carlo methods. My calculations give:

Code:
            N         p_est      SE
          -----------------------------
            1        0.7441    0.0043
            2        0.5866    0.0049
            3        0.4706    0.0050
            4        0.3712    0.0048
            5        0.3153    0.0046
            6        0.2608    0.0044
            7        0.2154    0.0041
            8        0.1714    0.0038
            9        0.1445    0.0035
           10        0.1180    0.0032

where N is the dimension of the problem, p_est is the MC estimate of the probability and SE is the approximate standard error of the estimate.
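For anyone who wants to reproduce estimates of this kind, here is an illustrative sketch (in Python with NumPy, which is an assumption; it is not the code used for the table above): points uniform in the n-ball are generated as a random direction times a radius distributed as \(U^{1/n}\).

Code:
import numpy as np

rng = np.random.default_rng(3)

def sample_ball(n_dim, n_points):
    """Uniform points in the unit n-ball: Gaussian direction, radius ~ U**(1/n)."""
    g = rng.normal(size=(n_points, n_dim))
    g /= np.linalg.norm(g, axis=1, keepdims=True)
    r = rng.uniform(size=(n_points, 1)) ** (1.0 / n_dim)
    return g * r

N = 100_000
for n_dim in range(1, 11):
    p, q = sample_ball(n_dim, N), sample_ball(n_dim, N)
    hit = np.linalg.norm(p - q, axis=1) < 1.0
    est = hit.mean()
    se = np.sqrt(est * (1.0 - est) / N)
    print(f"{n_dim:2d}   {est:.4f}   {se:.4f}")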

CB
 
  • #19
The original question was about 'the distance between two random points in an n-sphere of radius 1'... I proposed a precise definition of 'random point' in the case n=2 and someone seems not to agree with me... never mind!... but I don't understand what the correct definition of 'random point' in a 2-sphere of radius 1 is... can someone resolve my doubt, please!...

Kind regards

$\chi$ $\sigma$
 
  • #20
chisigma said:
Clearly it is no restriction to suppose that the first point lies on the positive real axis at distance $x_{0}$ from the centre, and in this case, as illustrated in the figure...
View attachment 170

... the probability that the second random point $x_{1}$ satisfies $|x_{1}-x_{0}|<1$ is the ratio of the area of the two circular segments separated by the line $x=\frac{x_{0}}{2}$ (the lens common to the unit circle and the unit circle centred at $x_{0}$) to the area of the unit circle, i.e. ...

$\displaystyle P\{|x_{1}-x_{0}|<1\} = \frac{2}{\pi}\ (\cos ^{-1} \frac{x_{0}}{2} - \frac{x_{0}}{2}\ \sqrt{1-\frac{x_{0}^{2}}{4}})$ (1)

But $x_{0}$ is uniformly distributed from 0 to 1, so that the requested probability is...

$\displaystyle P= \frac{4}{\pi} \int_{0}^{\frac{1}{2}} (\cos^{-1} x -x\ \sqrt{1-x^{2}})\ dx = \frac{4}{\pi}\ |x\ \cos^{-1} x -\sqrt{1-x^{2}} + \frac{(1-x^{2})^{\frac{3}{2}}}{3}|_{0}^{\frac{1}{2}}= .688849968669...$ (2)

Kind regards

$\chi$ $\sigma$

... if 'random point inside the unit circle' means that the probability of finding that point inside a region of area A included in the unit circle is $\frac{A}{\pi}$, then $x_{0}$ indeed isn't uniformly distributed from 0 to 1, but has p.d.f. $2\ x$. In that case the requested probability is...

$\displaystyle P= \frac{4}{\pi}\ \int_{0}^{1} \left(x\ \cos^{-1} \frac{x}{2} -\frac{x^{2}}{2}\ \sqrt{1-\frac{x^{2}}{4}}\right)\ dx $
....$\displaystyle = \frac{4}{\pi}\ \left| \left(\frac{x^{2}}{2}-1\right)\ \cos^{-1} \frac{x}{2} - \frac{x\ \sqrt{4-x^{2}}}{4} + \frac{x\ (4-x^{2})^{\frac{3}{2}}}{16} - \frac{x\ \sqrt{4-x^{2}}}{8} - \frac{1}{2}\ \sin^{-1} \frac{x}{2}\right|_{0}^{1}$

....$ = .586503328433...$

Kind regards

$\chi$ $\sigma$
 
Last edited by a moderator:
  • #21
Posted on 05 12 2012 on http://www.scienzematematiche.it/ by the member francifamy [original in Italian...] and not yet properly solved...

A boy is hitchhiking on a country road on which, on average, 1 car passes every 10 minutes, according to Poisson statistics. If the probability that a car takes the boy on board is .1, what is the probability that in 30 minutes no car takes him on board?...

Kind regards

$\chi$ $\sigma$

 
  • #22
chisigma said:
Posted on 05 12 2012 on http://www.scienzematematiche.it/ by the member francifamy [original in Italian...] and not yet properly solved...

A boy is hitchhiking on a country road on which, on average, 1 car passes every 10 minutes, according to Poisson statistics. If the probability that a car takes the boy on board is .1, what is the probability that in 30 minutes no car takes him on board?...

Kind regards

$\chi$ $\sigma$



The number of cars in a time interval that would give the boy a lift has a Poisson distribution with 1/10 the mean of the number of cars.

CB
 
  • #23
CaptainBlack said:
The number of cars in a time interval that would give the boy a lift has a Poisson distribution with 1/10 the mean of the number of cars.

CB

Very well!... if we assume the time unit to be 30 minutes, then $\lambda=3$ and the probability that k cars pass in 30 minutes is...

$\displaystyle P_{k}= e^{- \lambda}\ \frac{\lambda^{k}}{k!}= e^{-3}\ \frac{3^{k}}{k!}$ (1)

For each car, the probability of not taking the boy on board is $p=.9$, so that the requested probability is...

$\displaystyle P= e^{- \lambda}\ \sum_{k=0}^{\infty} \frac{(\lambda p)^{k}}{k!}= e^{-.3} = .7408182206817...$ (2)

Kind regards

$\chi$ $\sigma$
 
  • #24
chisigma said:
Very well!... if we assume the time unit to be 30 minutes, then $\lambda=3$ and the probability that k cars pass in 30 minutes is...

$\displaystyle P_{k}= e^{- \lambda}\ \frac{\lambda^{k}}{k!}= e^{-3}\ \frac{3^{k}}{k!}$ (1)

For each car, the probability of not taking the boy on board is $p=.9$, so that the requested probability is...

$\displaystyle P= e^{- \lambda}\ \sum_{k=0}^{\infty} \frac{(\lambda p)^{k}}{k!}= e^{-.3} = .7408182206817...$ (2)

Kind regards

$\chi$ $\sigma$

The mean number of potential lifts in 30 minutes is 0.3

CB
 
  • #25
CaptainBlack said:
The mean number of potential lifts in 30 minutes is 0.3

CB

Before today it was obvious to me that, if the expected number of arriving cars in 10 minutes is 1, then the expected number of arriving cars in 30 minutes would be 3... maybe new regulations have been introduced recently? (Wasntme)...

Kind regards

$\chi$ $\sigma$
 
  • #26
chisigma said:
Before today it was obvious to me that, if the expected number of arriving cars in 10 minutes is 1, then the expected number of arriving cars in 30 minutes would be 3... maybe new regulations have been introduced recently? (Wasntme)...

Kind regards

$\chi$ $\sigma$

Only 1 in ten will be available for a lift

CB
 
  • #27
I apologize for the fact that I supplied the solution without explaining the solving procedure. We are asked to find the probability of the following event: in a time T, none of the arriving cars takes the boy on board. The arrivals of the cars are Poisson, so that the probability of k cars arriving in T is...

$\displaystyle P_{k}= e^{- \lambda}\ \frac{\lambda^{k}}{k!}$ (1)

If we denote by q the probability that a car takes the boy on board and by p=1-q the probability that it doesn't, then the requested probability is the sum over k of the products of $P_{k}$ and the probability $p^{k}$ that none of the k cars takes the boy on board, so that...

$\displaystyle P = \sum_{k=0}^{\infty} p^{k}\ P_{k} = e^{- \lambda}\ \sum_{k=0}^{\infty} \frac {(\lambda p)^{k}}{k!} = e^{- \lambda (1-p)} = e^{- \lambda\ q}$ (2)

In the proposed problem we have $\lambda=3$ and $q=.1$, so that we can find P without any other information...
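A minimal Monte-Carlo check of formula (2) (an illustrative Python/NumPy sketch, which is an assumption and not part of the original exchange): simulate the Poisson number of cars and thin each car independently with probability q.

Code:
import numpy as np

rng = np.random.default_rng(4)
N = 1_000_000
lam, q = 3.0, 0.1    # cars in 30 minutes; probability a given car stops

cars = rng.poisson(lam, size=N)
lifts = rng.binomial(cars, q)      # each passing car stops independently with prob. q

print((lifts == 0).mean())         # empirical probability of no lift
print(np.exp(-lam * q))            # e^{-0.3} from formula (2)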

Kind regards

$\chi$ $\sigma$
 
  • #28
Posted on 05 26 2012 on www.mathhelpforum.com by the member jsndacruz and not yet properly solved…

A very simple question has me very confused. Assume that a roulette table has 38 numbers 0, 00, 1, 2 ..., 35, 36. 0 & 00 are green, and the remaining numbers are split evenly between red and black. If a player bets black, the probability of winning and doubling his money is 18/38 ≈ 0.473.

Consider the following strategy. The player initially bets $1 on black. On each turn, if the player wins, he stops playing. If the player loses, he doubles his bet and keeps playing. Calculate the expected value of the player's earnings...


Kind regards

$\chi$ $\sigma$
 
  • #29
chisigma said:
Posted on 05 26 2012 on www.mathhelpforum.com by the member jsndacruz and not yet properly solved…

A very simple question has me very confused. Assume that a roulette table has 38 numbers 0, 00, 1, 2 ..., 35, 36. 0 & 00 are green, and the remaining numbers are split evenly between red and black. If a player bets black, the probability of winning and doubling his money is 18/38 ≈ 0.473.

Consider the following strategy. The player initially bets $1 on black. On each turn, if the player wins, he stops playing. If the player loses, he doubles his bet and keeps playing. Calculate the expected value of the player's earnings...

If p is the favorable probability in each bet, the probability that the game ends after n bets is...

$\displaystyle P_{n}= p\ (1-p)^{n-1}$ (1)

If 'earnings' means the difference between the money gained and the money invested, then when the game ends at the n-th bet the player has staked $2^{n}-1$ and receives $2^{n}$, so his earnings are...

$\displaystyle 2^{n}-(2^{n}-1)=1$ (2)

... no matter what n is, so the 'sure bet' guarantees a gain of 1 dollar. The expected value of n is...

$\displaystyle E\{n\} = p\ \sum_{n=1}^{\infty} n\ (1-p)^{n-1}= \frac{p}{p^{2}}=\frac{1}{p}$ (3)

Wonderful!... or not exactly, because the expected value of the invested money M is...

$\displaystyle E\{M\} = p\ \sum_{n=1}^{\infty} (2^{n}-1)\ (1-p)^{n-1}= ... $ (4)

... and the series in (4) diverges for $2\ (1-p) \ge 1$ (Dull) ...
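A short simulation (an illustrative Python/NumPy sketch, which is an assumption) makes the point concrete: every completed game nets exactly 1 dollar, but the sample mean of the money staked keeps growing as more games are simulated, because its expectation is infinite.

Code:
import numpy as np

rng = np.random.default_rng(5)
p = 18.0 / 38.0      # probability of winning a single bet on black
N = 100_000          # number of complete martingale games

n = rng.geometric(p, size=N)       # bets until the first win (geometric)
staked = 2.0 ** n - 1              # total money put on the table in a game of length n
won = 2.0 ** n                     # gross return on the final, winning bet

print((won - staked).mean())       # net earnings per game: always exactly 1
print(staked.mean())               # unstable: the theoretical mean diverges for 2(1-p) >= 1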

Kind regards

$\chi$ $\sigma$
 
Last edited:
  • #30
Posted on 05 23 2012 on www.artofproblemsolving.com by the member pablo_ro and not yet properly solved…

Let X and Y be two independent random variables with p.d.f.s $f_{x}(*)$ and $f_{y}(*)$. Find the p.d.f.s of the r.v.s U=X Y and V=X/Y...

Kind regards

$\chi$ $\sigma$
 
  • #31
chisigma said:
Posted on 05 23 2012 on www.artofproblemsolving.com by the member pablo_ro and not yet properly solved…

Let X and Y be two independent random variables with p.d.f.s $f_{x}(*)$ and $f_{y}(*)$. Find the p.d.f.s of the r.v.s U=X Y and V=X/Y...


Let us first compute the 'product distribution' of the r.v. U=X Y. Introducing the auxiliary variable V we can write...

$\displaystyle U=X\ Y,\ V=Y \implies X= \frac{U}{V},\ Y=V $ (1)

... and computing the Jacobian of (1) we obtain $\displaystyle J= \frac{1}{|V|}$. Now, if we denote by $\displaystyle f_{x,y}(*,*)$ the joint p.d.f. of X and Y, we obtain...

$\displaystyle f_{u}(u)= \int_{-\infty}^{+\infty} \frac{1}{|v|}\ f_{x,y} (\frac{u}{v}, v)\ dv$ (2)

If we want the 'ratio distribution' instead of the 'product distribution' we only have to set in (1) $\displaystyle U=\frac{X}{Y} \implies X = U\ V$, obtaining...

$\displaystyle f_{u}(u)= \int_{-\infty}^{+\infty} |v|\ f_{x,y} (u\ v, v)\ dv$ (3)
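As an illustration of formula (3), here is a Python/SciPy sketch (the choice of Python and of X, Y as standard normals are assumptions, not part of the original question): the ratio of two independent standard normals is standard Cauchy, so numerically integrating (3) should reproduce the Cauchy density.

Code:
import numpy as np
from scipy import integrate, stats

def ratio_pdf(u, f_x, f_y):
    """Formula (3): f_U(u) = integral of |v| f_X(u v) f_Y(v) dv, X and Y independent."""
    integrand = lambda v: abs(v) * f_x(u * v) * f_y(v)
    val, _ = integrate.quad(integrand, -np.inf, np.inf, limit=200)
    return val

f = stats.norm.pdf
for u in (-2.0, -0.5, 0.0, 1.0, 3.0):
    print(u, ratio_pdf(u, f, f), stats.cauchy.pdf(u))   # the two columns should agree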

Kind regards

$\chi$ $\sigma$
 
  • #32
Posted on 06 04 2012 on www.matematicamente.it by stelladinatale [original in Italian…] and not yet solved…

In the time interval (0,t] fellows are born according to a Poisson process with parameter 1. Each fellow has a watch that rings according to a Poisson process with parameter a. When the watch rings the fellow dies. It is requested to demonstrate that the probability that a fellow born in (0,t] is alive at the time t is…

$\displaystyle p(t)= \frac{1-e^{-a\ t}}{a\ t}$

Kind regards

$\chi$ $\sigma$
 
  • #33
chisigma said:
Posted on 06 04 2012 on www.matematicamente.it by stelladinatale [original in Italian…] and not yet solved…

In the time interval (0,t] fellows are born according to a Poisson process with parameter 1. Each fellow has a watch that rings according to a Poisson process with parameter a. When the watch rings the fellow dies. It is requested to demonstrate that the probability that a fellow born in (0,t] is alive at the time t is…

$\displaystyle p(t)= \frac{1-e^{-a\ t}}{a\ t}$

Given that a fellow was born in (0,t], his birth time $\tau$ is uniformly distributed between 0 and t. The probability that the fellow is alive at the time t is then...

$\displaystyle p(t)= \frac{a}{t}\ \int_{0}^{t} \int_{t-\tau}^{\infty} e^{-a\ x}\ dx\ d \tau = \frac{1}{t}\ \int_{0}^{t} e^{- a\ (t-\tau)}\ d \tau = \frac{1-e^{-a\ t}}{a\ t}$ (1)
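A quick Monte-Carlo check of (1) (an illustrative Python/NumPy sketch, with the arbitrary choices a=2 and t=3, all of which are assumptions): conditionally on being born in (0,t], the birth time is uniform and the remaining lifetime is exponential with rate a.

Code:
import numpy as np

rng = np.random.default_rng(6)
a, t, N = 2.0, 3.0, 1_000_000

birth = rng.uniform(0.0, t, size=N)          # birth time, uniform on (0, t]
lifetime = rng.exponential(1.0 / a, size=N)  # time until the watch rings, Exp(a)

alive = birth + lifetime > t
print(alive.mean())                          # empirical p(t)
print((1.0 - np.exp(-a * t)) / (a * t))      # formula (1)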

Kind regards

$\chi$ $\sigma$
 
  • #34
Posted on 05 21 2012 on www.artofproblemsolving.com by the member pablo_ro and not yet properly solved...

Compute the integral using random variables...

$\displaystyle \int_{0}^{1} ... \int_{0}^{1} \sqrt{x_{1}^{2}+ ...+ x_{n}^{2}}\ d x_{1}...d x_{n}$


Of course this is not strictly a question in the area of probability; nevertheless it is an intriguing question whose solution is not trivial...

Kind regards

$\chi$ $\sigma$
 
  • #35
chisigma said:
Posted on 05 21 2012 on www.artofproblemsolving.com by the member pablo_ro and not yet properly solved...

Compute the integral using random variables...

$\displaystyle \int_{0}^{1} ... \int_{0}^{1} \sqrt{x_{1}^{2}+ ...+ x_{n}^{2}}\ d x_{1}...d x_{n}$


Of course this is not strictly a question in the area of probability; nevertheless it is an intriguing question whose solution is not trivial...

Kind regards

$\chi$ $\sigma$


The wording here is pretty obscure, but it seems to imply computing the integral using Monte-Carlo methods, in which case we would need a specific value for n.

CB
 
  • #36
chisigma said:
Posted on 05 21 2012 on www.artofproblemsolving.com by the member pablo_ro and not yet properly solved...

Compute the integral using random variables...

$\displaystyle \int_{0}^{1} ... \int_{0}^{1} \sqrt{x_{1}^{2}+ ...+ x_{n}^{2}}\ d x_{1}...d x_{n}$


Of course this is not strictly a question in the area of probability; nevertheless it is an intriguing question whose solution is not trivial...

Kind regards

$\chi$ $\sigma$

This seems to describe the average distance from the origin of a randomly placed $n$-dimensional point in the unit hypercube. For what it's worth, I had Mathematica churn out the definite integrals for the first few $n$. I was able to get analytical solutions up to $n = 3$, but the closed-form expression for $n = 3$ is too unwieldy to post here. For $n > 11$, it takes a long time to produce a result:

$\begin{array}{|l|r|}
n &\mathrm{Integral} \\
~ &~ \\
1 & \frac{1}{2} \\
2 &\frac{\sqrt{2} + \sinh^{-1}{1}}{3} = 0.765 \\
3 &0.961 \\
4 &1.12 \\
5 &1.26 \\
6 &1.39 \\
7 &1.50 \\
8 &1.61 \\
9 &1.71 \\
10 &1.81\\
11 &1.90\\
\end{array}$

It seems to grow at a decreasing rate. The integral is relatively well approximated by $\sqrt{\frac{n}{3}}$, and the approximation gets increasingly better as $n$ increases. This result is obtained by taking the square root outside the integral (each $x_i^2$ has mean $\tfrac{1}{3}$, so the sum under the root has mean $\tfrac{n}{3}$); pulling the root out is asymptotically valid as $n$ grows large, because the distance concentrates around its mean: the familiar curse-of-dimensionality observation that the Euclidean metric discriminates poorly in high-dimensional space.

I don't see a way to nicely calculate a closed-form expression of the integral as a function of $n$, but I suspect it is possible. That's all I can say though (Sadface)
 
  • #37
Just to show the power of Monte-Carlo as a method for estimating high-order multiple integrals, I attach a plot of the integral against order:

View attachment 182

Note 1: the last label on the horizontal axis has been clipped; it is 10000.

Note 2: the standard error is more or less independent of n, and is less than 0.01 taking the average over 1000 sample points for each n.

Note 3: Looking at the curve in detail it is possible to detect what appears to be a deviation from the asymptotic form \( \sqrt{n/3} \); IIRC it looks like \( I_n \approx \sqrt{n/3}-0.03 \) is a better approximation over the range investigated.
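For what it's worth, here is a minimal sketch of this kind of Monte-Carlo computation (in Python with NumPy, which is an assumption; it is not the code used for the attached plot), comparing the estimate with \( \sqrt{n/3} \) for a few values of n.

Code:
import numpy as np

rng = np.random.default_rng(7)

def mc_estimate(n_dim, n_samples=100_000):
    """Monte-Carlo estimate of the integral of sqrt(x_1^2+...+x_n^2) over the unit cube."""
    x = rng.uniform(size=(n_samples, n_dim))
    d = np.linalg.norm(x, axis=1)
    return d.mean(), d.std(ddof=1) / np.sqrt(n_samples)

for n_dim in (1, 2, 3, 10, 100):
    est, se = mc_estimate(n_dim)
    print(f"n={n_dim:4d}   estimate={est:.4f} +/- {se:.4f}   sqrt(n/3)={np.sqrt(n_dim / 3.0):.4f}")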
 

Attachments

  • MC integral.PNG
Last edited:
  • #38
chisigma said:
Posted on 05 21 2012 on www.artofproblemsolving.com by the member pablo_ro and not yet properly solved...

Compute the integral using random variables...

$\displaystyle \int_{0}^{1} ... \int_{0}^{1} \sqrt{x_{1}^{2}+ ...+ x_{n}^{2}}\ d x_{1}...d x_{n}$


Of course this is not strictly a question in the area of probability; nevertheless it is an intriguing question whose solution is not trivial...



Several years ago [see http://www.osti.gov/bridge/servlets/purl/919496-nVsPUt/919496.pdf...] D.H. Bailey, J.M. Borwein and R.E. Crandall examined the family of complex-variable functions...

$\displaystyle B_{n}(s)= \int_{0}^{1} ... \int_{0}^{1} (x_{1}^{2}+...+x_{n}^{2})^{\frac{s}{2}}\ d x_{1} ... d x_{n}$ (1)

... and they arrived at the one-dimensional integral formula...

$\displaystyle B_{n}(s)= \frac{s}{\Gamma(1-\frac{s}{2})}\ \int_{0}^{\infty} \frac{1-b(u)^{n}}{u^{1+s}}\ du$ (2)

... where...

$\displaystyle b(u)= \int_{0}^{1} e^{- u^{2}\ x^{2}}\ dx = \frac{\sqrt{\pi}}{2}\ \frac{\text{erf}\ (u)}{u}$ (3)

... and, as a collateral result, at the asymptotic relation...

$\displaystyle B_{n}(s) \sim (\frac{n}{3})^{\frac{s}{2}}\ \{1 + \frac{s\ (s-2)}{10\ n} + ...\}$ (4)
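A numerical check of (2)-(3) can be sketched as follows (a Python/SciPy illustration; the library choice and the small positive lower limit of integration are assumptions of mine): for s=1 the values should reproduce the Mathematica column above (0.5, 0.765, 0.961, ...) and approach \( \sqrt{n/3} \) for large n.

Code:
import numpy as np
from scipy import integrate, special

def b(u):
    """Formula (3): b(u) = (sqrt(pi)/2) * erf(u) / u."""
    return 0.5 * np.sqrt(np.pi) * special.erf(u) / u

def B(n, s=1.0):
    """Box integral B_n(s) via the one-dimensional representation (2)."""
    integrand = lambda u: (1.0 - b(u) ** n) / u ** (1.0 + s)
    # the integrand tends to the finite limit n/3 as u -> 0, so starting just
    # above zero loses only a negligible sliver of the integral
    val, _ = integrate.quad(integrand, 1e-9, np.inf, limit=300)
    return s / special.gamma(1.0 - s / 2.0) * val

for n in (1, 2, 3, 10, 100):
    print(f"n={n:3d}   B_n(1)={B(n):.4f}   sqrt(n/3)={np.sqrt(n / 3.0):.4f}")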

Kind regards

$\chi$ $\sigma$

 
  • #39
chisigma said:
Several years ago [see http://www.osti.gov/bridge/servlets/purl/919496-nVsPUt/919496.pdf...] D.H. Bailey, J.M. Borwein and R.E. Crandall examined the family of complex-variable functions...

$\displaystyle B_{n}(s)= \int_{0}^{1} ... \int_{0}^{1} (x_{1}^{2}+...+x_{n}^{2})^{\frac{s}{2}}\ d x_{1} ... d x_{n}$ (1)

... and they arrived at the one-dimensional integral formula...

$\displaystyle B_{n}(s)= \frac{s}{\Gamma(1-\frac{s}{2})}\ \int_{0}^{\infty} \frac{1-b(u)^{n}}{u^{1+s}}\ du$ (2)

... where...

$\displaystyle b(u)= \int_{0}^{1} e^{- u^{2}\ x^{2}}\ dx = \frac{\sqrt{\pi}}{2}\ \frac{\text{erf}\ (u)}{u}$ (3)

... and, as a collateral result, at the asymptotic relation...

$\displaystyle B_{n}(s) \sim (\frac{n}{3})^{\frac{s}{2}}\ \{1 + \frac{s\ (s-2)}{10\ n} + ...\}$ (4)

Kind regards

$\chi$ $\sigma$



Interesting, I will have to have another run of the MC, since for \(s=1\) that gives:

\(I_n \sim \sqrt{\frac{n}{3}}-\frac{1}{10\sqrt{3n}}\)

which is significantly different from what I recall, and would have been undetectable.

CB
 
Last edited:
  • #40
CaptainBlack said:
Interesting, I will have to have another run of the MC, since for \(s=1\) that gives:

\(I_n \sim \sqrt{\frac{n}{3}}-\frac{1}{10\sqrt{3n}}\)

which is significantly different from what I recall, and would have been undetectable.

CB

Well, increasing the sample size to 1000000 seems to indicate that the above asymptotic form is better still (over the range checked).

CB
 
  • #41
Posted on 06 06 2012 on www.mathhelpforum.com by tttcomrader and not yet solved...

Let X and Y be random variables with distributions f(*) and g(*), and write their Laplace Transforms as F(*) and G(*). Show that...

$\displaystyle E\ \{e^{- \lambda\ X\ Y} \} = \int_{0}^{\infty} F(\lambda\ y)\ d G(y) = \int_{0}^{\infty} G(\lambda\ y)\ d F(y)$

Kind regards

$\chi$ $\sigma$
 
  • #42
chisigma said:
Posted on 06 06 2012 on www.mathhelpforum.com by tttcomrader and not yet solved...

Let X and Y be random variables with distributions f(*) and g(*), and write their Laplace Transforms as F(*) and G(*). Show that...

$\displaystyle E\ \{e^{- \lambda\ X\ Y} \} = \int_{0}^{\infty} F(\lambda\ y)\ d G(y) = \int_{0}^{\infty} G(\lambda\ y)\ d F(y)$

Kind regards

$\chi$ $\sigma$

Is there anything to do? Writing out the definition of the expectation and doing one of the integrals should suffice.

CB
 
  • #43
Posted on 06 03 2012 on www.talkstat.com by the member jumpydad and not yet properly solved…

Here is an exercise that I don't understand how to solve: imagine a counter that serves clients. In every 10 min. interval the number of people getting served follows a Poisson distribution with an expected value of 2. The counter only works from 9 am till 12 noon, and serves at most 40 people. Now the question is: what is the probability of not serving all clients in one morning (9 am till 12 noon)?...

Kind regards

$\chi$ $\sigma$
 
  • #44
chisigma said:
Posted on 06 03 2012 on www.talkstat.com by the member jumpydad and not yet properly solved…

Here is an exercise that I don't understand how to solve: imagine a counter that serves clients. In every 10 min. interval the number of people getting served follows a Poisson distribution with an expected value of 2. The counter only works from 9 am till 12 noon, and serves at most 40 people. Now the question is: what is the probability of not serving all clients in one morning (9 am till 12 noon)?...

Kind regards

$\chi$ $\sigma$

The number served in 3 hrs has a Poisson distribution with a mean of 36.

CB
 
  • #45
chisigma said:
Posted on 06 03 2012 on www.talkstat.com by the member jumpydad and not yet properly solved…

Here is an exercise that I don't understand how to solve: imagine a counter that serves clients. In every 10 min. interval the number of people getting served follows a Poisson distribution with an expected value of 2. The counter only works from 9 am till 12 noon, and serves at most 40 people. Now the question is: what is the probability of not serving all clients in one morning (9 am till 12 noon)?...


Very well!... in a time T=180 min. the expected number of people getting served is $\displaystyle \lambda=36$ and the probability of having exactly n people is...

$\displaystyle P_{n}= e^{- \lambda}\ \frac{\lambda^{n}}{n!}$ (1)

The probability of not serving all clients is...

$\displaystyle P_{ow}= 1- e^{- 36}\ \sum_{n=0}^{40} \frac{36^{n}}{n!}$ (2)

At this point the problem is the computation of the sum in (2), which can be performed, for example, using wolframalpha...

http://www.wolframalpha.com/input/?i=sum+e^(-+36)+36^j/+j!,+j=0...+40

... so that $P_{ow}= .222897574398...$. If wolframalpha isn't available, then the sum in (2) can be efficiently computed as explained in...

http://www.mathhelpboards.com/threads/426-Difference-equation-tutorial-draft-of-part-I

... using the 40th term of the sequence defined by the difference equation...

$\displaystyle a_{n+1}=\frac{n+1}{36}\ a_{n}+1\ ,\ a_{0}=1$ (3)

... since $\displaystyle \sum_{n=0}^{40} \frac{36^{n}}{n!} = \frac{36^{40}}{40!}\ a_{40}$.
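A minimal computational sketch of both routes (a plain Python illustration, which is an assumption; the variable names are arbitrary): the direct sum and the recurrence (3) should give the same value quoted above.

Code:
import math

lam, cap = 36.0, 40

# direct computation of P{N > cap} for N ~ Poisson(lam)
direct = 1.0 - sum(math.exp(-lam) * lam**n / math.factorial(n) for n in range(cap + 1))

# the same tail via the difference equation (3): a_{n+1} = (n+1)/lam * a_n + 1, a_0 = 1
a = 1.0
for n in range(cap):
    a = (n + 1) / lam * a + 1.0
recurrence = 1.0 - math.exp(-lam) * lam**cap / math.factorial(cap) * a

print(direct, recurrence)   # both should agree (about 0.2229)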

Kind regards

$\chi$ $\sigma$
 
Last edited:
  • #46
chisigma said:
Very well!... in a time T=180 min. the expected number of people getting served is $\displaystyle \lambda=32$ and the probability of having exactly n people is...

$\displaystyle P_{n}= e^{- \lambda}\ \frac{\lambda^{n}}{n!}$ (1)

The probability of not serving all clients is...

$\displaystyle P_{ow}= 1- e^{- 32}\ \sum_{n=0}^{40} \frac{32^{n}}{n!}$ (2)

At this point the problem is the computation of the sum in (2), that can be performed, for example, using wolframalpha...

http://www.wolframalpha.com/input/?i=sum+e%5E%28-+32%29+32%5Ej%2F+j%21%2C+j%3D0...+40

... so that $P_{ow}= .070660852878...$. If wolframalpha isn't available, then the sum in (2) can be efficiently computed as explained in...

http://www.mathhelpboards.com/threads/426-Difference-equation-tutorial-draft-of-part-I

... using the 40th term of the sequence defined by the difference equation...

$\displaystyle a_{n+1}=\frac{n+1}{32}\ a_{n}+1\ ,\ a_{0}=1$ (3)

Kind regards

$\chi$ $\sigma$

\( 18 \times 2=36 \)

CB
 
  • #47
Posted on 06 05 2012 on www.talkstat.com by the member Youler and not yet solved…

When cycling home at night, I notice that sometimes my rear light is switched off when I arrive home. Presumably the switch is loose and can flip from on to off, or back again, when I go over bumps. I suppose that the number n of flips per trip has a Poisson distribution...

$\displaystyle P \{ k=n\} = e^{-\lambda}\ \frac{\lambda^{n}}{n!}$

If the probability that the light is still on when I arrive home is p, find $\lambda$...

Kind regards

$\chi$ $\sigma$
 
  • #48
chisigma said:
Posted on 06 05 2012 on www.talkstat.com by the member Youler and not yet solved…

When cycling home at night, I notice that sometimes my rear light is switched off when I arrive home. Presumably the switch is loose and can flip from on to off, or back again, when I go over bumps. I suppose that the number n of flips per trip has a Poisson distribution...

$\displaystyle P \{ k=n\} = e^{-\lambda}\ \frac{\lambda^{n}}{n!}$

If the probability that the light is still on when I arrive home is p, find $\lambda$...


If we denote by $P_{on}$ and $P_{off}$ the probabilities that the light is on or off when I arrive home, then we have...

$\displaystyle P_{on}= e^{- \lambda}\ \sum_{n\ even} \frac{\lambda^{n}}{n!}= e^{- \lambda}\ \cosh \lambda= \frac{1+ e^{-2\ \lambda}}{2}$ (1)

$\displaystyle P_{off}= e^{- \lambda}\ \sum_{n\ odd} \frac{\lambda^{n}}{n!}= e^{- \lambda}\ \sinh \lambda= \frac{1- e^{-2\ \lambda}}{2}$ (2)

The problem seems to be solved... but in fact we are required to find $\lambda$ as a function of $P_{on}$, and not $P_{on}$ as a function of $\lambda$, so that the inversion of (1) or (2) is necessary; that will be done in a subsequent post...

Kind regards

$\chi$ $\sigma$
 
  • #49
chisigma said:
If we denote by $P_{on}$ and $P_{off}$ the probabilities that the light is on or off when I arrive home, then we have...

$\displaystyle P_{on}= e^{- \lambda}\ \sum_{n\ even} \frac{\lambda^{n}}{n!}= e^{- \lambda}\ \cosh \lambda= \frac{1+ e^{-2\ \lambda}}{2}$ (1)

$\displaystyle P_{off}= e^{- \lambda}\ \sum_{n\ odd} \frac{\lambda^{n}}{n!}= e^{- \lambda}\ \sinh \lambda= \frac{1- e^{-2\ \lambda}}{2}$ (2)

The problem seems to be solved... but in fact we are required to find $\lambda$ as a function of $P_{on}$, and not $P_{on}$ as a function of $\lambda$, so that the inversion of (1) or (2) is necessary; that will be done in a subsequent post...

From the practical point of view it is easier to find $\lambda$ as a function of $q=P_{off}$ and afterwards make, if necessary, the substitution $ p=1-q$. The procedure is relatively easy...

$\displaystyle q=\frac{1-e^{-2 \lambda}}{2} \implies 1-2\ q= e^{-2\ \lambda} \implies \lambda= - \ln \sqrt{1-2\ q}= - \ln \sqrt {2\ p-1}$ (1)

... where $0< q < \frac{1}{2}$ and $\frac{1}{2}< p < 1$...
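A small check of (1) (an illustrative Python/NumPy sketch; the choice p = 0.8 is an arbitrary example value, not from the original question): recover \(\lambda\) from p and verify, by simulating Poisson numbers of flips, that the probability of an even number of flips is indeed p.

Code:
import numpy as np

rng = np.random.default_rng(8)

p_on = 0.8                                   # example value of p, with 1/2 < p < 1
lam = -np.log(np.sqrt(2.0 * p_on - 1.0))     # formula (1)
print(lam)

flips = rng.poisson(lam, size=1_000_000)     # number of flips per trip
print((flips % 2 == 0).mean())               # even number of flips = light still on; ~ p_on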

Kind regards

$\chi$ $\sigma$
 
  • #50
Posted on 06 15 2012 on www.mathhelpforum.com by the member saravananbs and not yet solved…

… if x and y are independent random variables such that $f(x)= e^{-x},\ x \ge 0$ and $g(y)=3\ e^{-3\ y},\ y \ge 0$, find the probability distribution function of $z=x/y$... how can it be taken forward?…

Kind regards

$\chi$ $\sigma$
 
