MHB Unsolved statistics questions from other sites, part II

  • Thread starter: chisigma
  • Tags: Statistics

Summary
The discussion centers on unsolved statistics questions from various forums, specifically focusing on a problem involving a game played by Daniel with a random number generator. Daniel wins money when a specific number is selected but loses money otherwise, and the objective is to determine the expected number of button presses until he has more money than he started with. Participants discuss the mathematical formulation of the problem, including the computation of probabilities and expected values associated with the game. Ambiguities in the problem's wording are also highlighted, particularly regarding the stopping conditions of the game. The thread aims to collaboratively find solutions to these statistical challenges.
  • #91
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Posted on 06 05 2013 on www.mathhelpforum.com by the user Nazarin and not yet solved...

Consider a machine which is turned off when there is no work. It is turned on and restarts work when enough orders, say N, have arrived at the machine. The setup times are negligible. The processing times are exponentially distributed with mean 1/mu and the average number of orders arriving per unit time is lambda (< mu). Suppose that lambda = 20 orders per hour, (1/mu) = 2 minutes and that the setup cost is 54 dollars. In operation the machine costs 12 dollars per minute. The waiting cost is 1 dollar per minute per order. Determine, for given threshold N, the average cost per unit time...

The average cost per unit time is the sum of three components...

a) the setup cost $C_{s}$...

b) the operating cost $C_{o}$...

c) the order waiting-time cost $C_{w}$...

If $\lambda$ is the mean number of orders arriving per unit time [we assume the unit time is 1 hour...], then the mean number of batches of N orders arriving per unit time is $\displaystyle \frac{\lambda}{N}$, so that $\displaystyle C_{s}= 54\ \frac{\lambda}{N}$. The operating time to process N orders is $\displaystyle \frac{N}{\mu}$, so that $\displaystyle C_{o}= 720\ \frac{N}{\mu}$. The mean waiting time per order is $\displaystyle \frac{1}{2}\ (\frac{N}{\lambda} + \frac{N}{\mu})$, so that the mean waiting-time cost is $\displaystyle 30\ N^{2}\ (\frac{1}{\lambda} + \frac{1}{\mu})$. The total mean cost per hour will be...

$\displaystyle C=C_{s} + C_{o} + C_{w} = 54\ \frac{\lambda}{N} + 720\ \frac{N}{\mu} + 30\ N^{2}\ \left(\frac{1}{\lambda} + \frac{1}{\mu}\right)\ (1)$

C can easily be computed by setting $\lambda = 20\ $ and $\mu=30\ $ in (1)...

Kind regards

$\chi$ $\sigma$
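For completeness, a small Python sketch that evaluates formula (1) exactly as written, with $\lambda = 20$ and $\mu = 30$, for a few values of the threshold N (the range of N is just for illustration):

Code:
lam, mu = 20.0, 30.0            # orders/hour and service rate/hour (1/mu = 2 minutes)

def mean_cost(N, lam=lam, mu=mu):
    """Average cost per unit time according to formula (1) above."""
    C_setup = 54.0 * lam / N
    C_oper  = 720.0 * N / mu
    C_wait  = 30.0 * N**2 * (1.0 / lam + 1.0 / mu)
    return C_setup + C_oper + C_wait

for N in range(1, 11):
    print(N, round(mean_cost(N), 2))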
 
  • #92
Re: Unsolved statistic questions from other sites, part II

Posted on 05 17 2013 on www.artofproblemsolving.com by the user doom48 and not yet solved...

Given that $\displaystyle Z \sim N (0,1)$ and n is a positive integer, prove that... $\displaystyle E \{Z^{2 n}\} = \frac{(2 n)!}{n!\ 2^{n}}$



Kind regards

$\chi$ $\sigma$
 
  • #93
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Posted on 05 17 2013 on www.artofproblemsolving.com by the user doom48 and not yet solved...

Given that $\displaystyle Z \sim N (0,1)$ and n is a positive integer, prove that... $\displaystyle E \{Z^{2 n}\} = \frac{(2 n)!}{n!\ 2^{n}}$


The moment generating function of $\displaystyle N(0,1)$ is...

$\displaystyle m(x)= E \{e^{Z\ x}\} = \frac{1}{\sqrt{2 \pi}}\ \int_{- \infty}^{+ \infty} e^{x\ z}\ e^{- \frac{z^{2}}{2}}\ dz = e^{\frac{x^{2}}{2}}\ (1)$

... and by definition is...

$\displaystyle E \{Z^{2 n}\} = \lim_{x \rightarrow 0} \frac {d^{2 n}}{d x^{2 n}} m(x)\ (2)$

Remembering the definition of Hermite Polynomial of order k...

$\displaystyle He_{k} (x)= (-1)^{k}\ e^{\frac{x^{2}}{2}}\ \frac {d^{k}}{d x^{k}} e^{- \frac{x^{2}}{2}}\ (3)$

... with simple steps we derive that...

$\displaystyle E \{ Z^{2 n}\} = (-1)^{n}\ He_{2 n} (0) = (2 n -1)!! = \frac{(2 n)!}{2^{n}\ n!}\ (4)$

Kind regards

$\chi$ $\sigma$
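A short SymPy cross-check of (2) against the claimed closed form, for the first few values of n (a minimal sketch):

Code:
import sympy as sp

x = sp.symbols('x')
m = sp.exp(x**2 / 2)                                         # m.g.f. of N(0,1), formula (1)

for n in range(1, 6):
    moment = sp.diff(m, x, 2 * n).subs(x, 0)                 # E{Z^(2n)} via (2)
    closed = sp.factorial(2 * n) / (sp.factorial(n) * 2**n)  # claimed closed form
    print(n, moment, closed, sp.simplify(moment - closed) == 0)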
 
  • #94
Re: Unsolved statistic questions from other sites, part II

Posted on 07 01 2010 on www.artofproblemsolving.com by the user hsiljak and not yet solved…

… a particle performing a random walk on the integer points of the semi-axis $x \ge 0$ moves a distance 1 to the right with probability a, and to the left with probability b, and stands still in the remaining cases [if x=0 it stands still instead of moving to the left]. Determine the steady-state probability distribution, and also the expectation of $x$ and $x^2$ over a long time, if the particle starts at the point 0…

Kind regards

$\chi$ $\sigma$
 
  • #95
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Posted on 07 01 2010 on www.artofproblemsolving.com by the user hsiljak and not yet solved…

… a particle performing a random walk on the integer points of the semi-axis $x \ge 0$ moves a distance 1 to the right with probability a, and to the left with probability b, and stands still in the remaining cases [if x=0 it stands still instead of moving to the left]. Determine the steady-state probability distribution, and also the expectation of $x$ and $x^2$ over a long time, if the particle starts at the point 0…

The position of the particle after n steps can be schematized as a Markov chain with the following state diagram...

http://dmqg0yef478ix.cloudfront.net/87/ad/i73182599._szw380h285_.jpg

The probability transition matrix is...

$\displaystyle P = \left | \begin{matrix} 1-a & a & 0 & 0 & \cdots & 0 & 0 & 0 \\ b & 1-a-b & a & 0 & \cdots & 0 & 0 & 0 \\ 0 & b & 1-a-b & a & \cdots & 0 & 0 & 0 \\ \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots & \vdots \\ 0 & 0 & 0 & 0 & \cdots & b & 1-a-b & a \\ 0 & 0 & 0 & 0 & \cdots & 0 & 0 & 1 \end{matrix} \right|\ (1)$

The expected value of X after n steps starting from 0 is...

$\displaystyle E \{ X\} = \sum_{k=1}^{n} k\ p_{k}\ (2)$

... where $\displaystyle p_{k}$ is the k-th element of the row 0 of $\displaystyle P^{n}$. Let’s evaluate some expected values…

$\displaystyle n=1;\ p_{1}=a;\ E\{X\}_{n=1} = a$

$\displaystyle n=2;\ p_{1}= 2\ a - a\ b -2\ a^{2},\ p_{2}= a^{2};\ E\{X\}_{n=2} = 2\ a - a\ b\ $

$\displaystyle n=3;\ p_{1}= 3\ a^{3} + (5\ b -6)\ a^{2} + (b^{2} - 3\ b + 3)\ a,\ p_{2}= -3\ a^{3} + (3 - 2\ b)\ a^{2},\ p_{3}= a^{3};\ E \{X\}_{n=3} = b\ a^{2} + (b^{2} -3\ b + 3)\ a\ $

Clearly, as n increases the task becomes harder and harder, because the n-th power of the matrix P has to be computed. In the next post we will try a different approach to the problem...

Kind regards

$\chi$ $\sigma$
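As a numerical cross-check of the small-n values above, a minimal sketch that builds a finite truncation of the matrix (1) and computes $E \{X\}$ directly; the truncation size and the test values a = 0.3, b = 0.2 are illustrative assumptions:

Code:
import numpy as np

def expected_position(a, b, steps, size=200):
    """E{X} after `steps` steps, using a finite truncation of the transition matrix (1)."""
    P = np.zeros((size, size))
    P[0, 0], P[0, 1] = 1 - a, a                      # at x = 0 the particle never moves left
    for i in range(1, size - 1):
        P[i, i - 1], P[i, i], P[i, i + 1] = b, 1 - a - b, a
    P[size - 1, size - 1] = 1.0                      # artificial boundary due to truncation
    p = np.zeros(size)
    p[0] = 1.0                                       # start at x = 0
    p = p @ np.linalg.matrix_power(P, steps)         # row 0 of P^steps
    return float(np.arange(size) @ p)

a, b = 0.3, 0.2                                      # test values, not part of the problem
for n in (1, 2, 3):
    print(n, expected_position(a, b, n))
# closed forms obtained above for n = 1, 2, 3:
print(a, 2*a - a*b, b*a**2 + (b**2 - 3*b + 3)*a)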
 
  • #96
Re: Unsolved statistic questions from other sites, part II

Posted on 05 27 2013 on www.artofproblemsolving.com by the user juliancasaa and not yet solved...

1. At the airport of some city the number of flights arriving per unit time (hour) is a random variable with Poisson distribution with a mean of 5. What is the probability that the time until the arrival of the third flight is at most two hours?...

2. The time in years elapsed from the moment of retirement until the death of an employee of a factory is a random variable with exponential distribution with mean B of 7 years. If 10 employees of the factory are pensioned, what is the probability that at most seven of them are still alive at the end of 10 years?...


Kind regards

$\chi$ $\sigma$
 
  • #97
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Posted on 05 27 2013 on www.artofproblemsolving.com by the user juliancasaa and not yet solved...

1. At the airport of some city the number of flights arriving per unit time (hour) is a random variable with Poisson distribution with a mean of 5. What is the probability that the time until the arrival of the third flight is at most two hours?...


If the mean number of flights arriving in one hour is 5, in two hours it is 10. The third flight arrives within two hours if and only if there are at least three arrivals in two hours, so the requested probability is...

$\displaystyle P = 1 - e^{-10}\ \sum_{k=0}^{2} \frac{10^{k}}{k!} = .99723...$

Kind regards

$\chi$ $\sigma$
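A quick check with SciPy; 'the third flight arrives within two hours' is the same event as 'at least three arrivals in two hours':

Code:
from scipy.stats import poisson, gamma

# P(at least 3 arrivals in 2 hours), the number of arrivals being Poisson with mean 10
print(poisson.sf(2, 10))             # 1 - P(N <= 2), approx. 0.99723

# equivalent check: the waiting time to the 3rd arrival is Erlang(3) with rate 5 per hour
print(gamma.cdf(2, a=3, scale=1/5))  # approx. 0.99723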
 
  • #98
Re: Unsolved statistic questions from other sites, part II

chisigma said:
Posted on 05 27 2013 on www.artofproblemsolving.com by the user juliancasaa and not yet solved... 2. The time in years elapsed from the moment of retirement until the death of an employee of a factory is a random variable with exponential distribution with mean B of 7 years. If 10 employees of the factory are pensioned, what is the probability that at most seven of them are still alive at the end of 10 years?...

The probability that one pensioner has died within 10 years is...

$\displaystyle p = 1 - e^{- \frac{10}{7}} = .76...\ (1)$

'At most seven still alive' means 'at least three dead', so the requested probability is...

$\displaystyle P = 1 - \sum_{k=0}^{2} \binom{10}{k} p^{k}\ (1-p)^{10-k}\ (2)$

The effective computation of (2) can be performed with a standard calculator and is left to the reader...

Kind regards

$\chi$ $\sigma$
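A short SciPy sketch of the computation left to the reader, reading 'at most seven still alive' as 'at least three dead':

Code:
from math import exp
from scipy.stats import binom

q = exp(-10/7)   # P(a pensioner is still alive after 10 years), approx. 0.2397
p = 1 - q        # P(a pensioner has died within 10 years),      approx. 0.7603

# P(at most 7 of the 10 are alive) = P(at least 3 have died) = 1 - P(deaths <= 2)
print(1 - binom.cdf(2, 10, p))
# the same probability computed on the number still alive
print(binom.cdf(7, 10, q))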
 
  • #99
Posted on 08 17 2013 on www.artofproblemsolving.com by the user mathemath and not yet solved...

If the characteristic function of a random variable X is $\varphi (t)$, what is the characteristic function of 1/X?...

Kind regards

$\chi$ $\sigma$
 
  • #100
chisigma said:
Posted on 08 17 2013 on www.artofproblemsolving.com by the user mathemath and not yet solved...

If the characteristic function of a random variable X is $\varphi (t)$, what is the characteristic function of 1/X?...

The problem is very interesting!... The first step is to remember that the definition of the characteristic function of a r.v. X with p.d.f. $f_{X} (x)$ is...

$\displaystyle \varphi_{X} (t)= \int_{- \infty}^{+ \infty} f_{X}(x)\ e^{i\ t\ x}\ dx\ (1)$

... and that $\displaystyle f_{X} (x)$ can be obtained from $\displaystyle \varphi_{X} (t)$ with the inversion formula...

$\displaystyle f_{X} (x) = \frac{1}{2 \pi}\ \int_{- \infty}^{+ \infty} \varphi_{X} (t)\ e^{- i\ t\ x}\ dt\ (2)$

The second step is, setting $\displaystyle Y= \frac{1}{X}$, to verify that the p.d.f. of Y is...

$\displaystyle f_{Y} (x) = \frac{1}{x^{2}}\ f_{X} (\frac{1}{x})\ = \frac{1}{2 \pi x^{2}}\ \int_{- \infty}^{+ \infty} \varphi_{X} (t)\ e^{- i\ \frac{t}{x}}\ dt\ (3)$

The third step is the use of (1) to compute...

$\displaystyle \varphi_{Y} (t) = \frac{1}{2 \pi}\ \int_{- \infty}^{+ \infty}\ \int_{- \infty}^{+ \infty}\ \frac{\varphi_{X} (s)}{x^{2}}\ e^{i\ (t\ x - \frac{s}{x})}\ ds\ dx\ (4)$

It remains to be seen whether it is possible to write the expression (4) in a simpler form...

Kind regards

$\chi$ $\sigma$
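Steps (1)-(3) can at least be sanity-checked numerically in a concrete case. A minimal sketch, assuming for illustration $X \sim N(3,1)$ (so that 1/X is well behaved), which compares a Monte-Carlo estimate of $\varphi_{Y}(t) = E\{e^{i\ t/X}\}$ with a numerical integration of the density in (3):

Code:
import numpy as np
from scipy.stats import norm

mu, sigma, t = 3.0, 1.0, 0.7     # illustrative choices, not part of the original problem

# Monte-Carlo estimate of phi_Y(t) = E{exp(i t / X)}
rng = np.random.default_rng(0)
X = rng.normal(mu, sigma, 2_000_000)
mc = np.mean(np.exp(1j * t / X))

# trapezoidal integration of f_Y(x) = f_X(1/x)/x^2 against exp(i t x), formula (3);
# the grid skips x = 0, where this density is negligible for the chosen X
x = np.concatenate([np.linspace(-50, -1e-3, 200_000), np.linspace(1e-3, 50, 200_000)])
f_Y = norm.pdf(1.0 / x, mu, sigma) / x**2
g = f_Y * np.exp(1j * t * x)
ni = np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(x))

print(mc, ni)                    # the two estimates should agree to a few decimals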
 
  • #101
Posted on 8 25 2013 on www.artofproblemsolving.com by the user adrianomeis and not yet solved...

Three sectors are chosen at random from circle C, having angles $\displaystyle \frac{\pi}{10}, \frac{2 \pi}{10}, \frac{3 \pi}{10}$ respectively. What is the probability that these sectors have no points (other than the centre) in common?...

Kind regards

$\chi$ $\sigma$
 
  • #102
chisigma said:
Posted on 8 25 2013 on www.artofproblemsolving.com by the user adrianomeis and not yet solved...

Three sectors are chosen at random from circle C, having angles $\displaystyle \frac{\pi}{10}, \frac{2 \pi}{10}, \frac{3 \pi}{10}$ respectively. What is the probability that these sectors have no points (other than the centre) in common?...

The problem is [relatively] easy if we consider that, calling $\displaystyle S_{k}$ the sector whose angle is $\displaystyle \frac{k}{10} \pi$, ...

a) $\displaystyle S_{3}$ can be taken, without loss of generality, to be the sector for which $\displaystyle 0< \theta < \frac{3}{10} \pi$...

b) all the favourable cases are represented by the distinct sequence $\displaystyle S_{3}-S_{2}- S_{1}$, because the other sequence $\displaystyle S_{3}-S_{1}-S_{2}$ is the same after swapping $\displaystyle \theta$ and $\displaystyle - \theta$...

Under these assumptions the requested probability is...

$\displaystyle P = \frac{1}{4 \pi^{2}}\ \int_{\frac{3}{10} \pi}^{\frac{17}{10} \pi} d x \int_{x + \frac{2}{10} \pi}^{\frac{19}{10} \pi} d y = \frac{49}{200}$

Kind regards

$\chi$ $\sigma$

P.S.: as explained in post #105, the result is wrong and the correct result should be $\displaystyle P= \frac{49}{100}$
 
  • #103
Posted on 8 27 2013 on www.artofproblemsolving.com by the user aktyw19 and not yet solved...

Points A, B and C are randomly chosen inside a circle. A fourth point, O is chosen. What is the probability that O lies inside triangle ABC?...

Kind regards

$\chi$ $\sigma$
 
  • #104
chisigma said:
The problem is [relatively] easy if we consider that, calling $\displaystyle S_{k}$ the sector whose angle is $\displaystyle \frac{k}{10} \pi$, ...

a) $\displaystyle S_{3}$ can be taken, without loss of generality, to be the sector for which $\displaystyle 0< \theta < \frac{3}{10} \pi$...

b) all the favourable cases are represented by the distinct sequence $\displaystyle S_{3}-S_{2}- S_{1}$, because the other sequence $\displaystyle S_{3}-S_{1}-S_{2}$ is the same after swapping $\displaystyle \theta$ and $\displaystyle - \theta$...

Under these assumptions the requested probability is...

$\displaystyle P = \frac{1}{4 \pi^{2}}\ \int_{\frac{3}{10} \pi}^{\frac{17}{10} \pi} d x \int_{x + \frac{2}{10} \pi}^{\frac{19}{10} \pi} d y = \frac{49}{200}$

Kind regards

$\chi$ $\sigma$

Which is wrong, since your second integral does not allow for the $S_1$ sector to fall in the gap between $S_3$ and $S_2$; also, the upper limit in the first integral should be $\frac{18}{10}\pi$.

Monte-Carlo simulation (which could have error but ..) gives a probability of no overlap of $$0.75$$ with SE roughly $$1.5 \times 10^{-4}$$.

.
 
  • #105
chisigma said:
The problem is [relatively] easy if we consider that, calling $\displaystyle S_{k}$ the sector whose angle is $\displaystyle \frac{k}{10} \pi$, ...

a) $\displaystyle S_{3}$ can be taken, without loss of generality, to be the sector for which $\displaystyle 0< \theta < \frac{3}{10} \pi$...

b) all the favourable cases are represented by the distinct sequence $\displaystyle S_{3}-S_{2}- S_{1}$, because the other sequence $\displaystyle S_{3}-S_{1}-S_{2}$ is the same after swapping $\displaystyle \theta$ and $\displaystyle - \theta$...

Under these assumptions the requested probability is...

$\displaystyle P = \frac{1}{4 \pi^{2}}\ \int_{\frac{3}{10} \pi}^{\frac{17}{10} \pi} d x \int_{x + \frac{2}{10} \pi}^{\frac{19}{10} \pi} d y = \frac{49}{200}$

What I said in point b) is 'half correct' and 'half wrong', in the sense that all the favourable cases are represented by both the $\displaystyle S_{3}-S_{2}- S_{1}$ and $\displaystyle S_{3}-S_{1}-S_{2}$ sequences. To understand that, let us suppose that the sectors have angles $\displaystyle \theta_{3} = \theta_{2}= \theta_{1}=0$, so that the no-overlap probability is of course P=1. If we consider only the sequence $\displaystyle S_{3}-S_{2}- S_{1}$ and proceed, we obtain...

$\displaystyle P = \frac{1}{4 \pi^{2}}\ \int_{0}^{2 \pi} d x \int_{x}^{2 \pi} d y = \frac{1}{2}$

... and that demonstrates that the sequence $\displaystyle S_{3}-S_{1}-S_{2}$ must also be taken into account. Proceeding in this way we obtain...

$\displaystyle P = \frac{1}{4 \pi^{2}}\ \int_{\frac{3}{10} \pi}^{\frac{17}{10} \pi} d x \int_{x + \frac{2}{10} \pi}^{\frac{19}{10} \pi} d y + \frac{1}{4 \pi^{2}}\ \int_{\frac{3}{10} \pi}^{\frac{17}{10} \pi} d x \int_{x + \frac{1}{10} \pi}^{\frac{18}{10} \pi} d y = 2\ \frac{49}{200} = \frac{49}{100}$

Kind regards

$\chi$ $\sigma$
 
  • #106
chisigma said:
What I said in point b) is 'half correct' and 'half wrong', in the sense that all the favourable cases are represented by both the $\displaystyle S_{3}-S_{2}- S_{1}$ and $\displaystyle S_{3}-S_{1}-S_{2}$ sequences. To understand that, let us suppose that the sectors have angles $\displaystyle \theta_{3} = \theta_{2}= \theta_{1}=0$, so that the no-overlap probability is of course P=1. If we consider only the sequence $\displaystyle S_{3}-S_{2}- S_{1}$ and proceed, we obtain...

$\displaystyle P = \frac{1}{4 \pi^{2}}\ \int_{0}^{2 \pi} d x \int_{x}^{2 \pi} d y = \frac{1}{2}$

... and that demonstrates that the sequence $\displaystyle S_{3}-S_{1}-S_{2}$ must also be taken into account. Proceeding in this way we obtain...

$\displaystyle P = \frac{1}{4 \pi^{2}}\ \int_{\frac{3}{10} \pi}^{\frac{17}{10} \pi} d x \int_{x + \frac{2}{10} \pi}^{\frac{19}{10} \pi} d y + \frac{1}{4 \pi^{2}}\ \int_{\frac{3}{10} \pi}^{\frac{17}{10} \pi} d x \int_{x + \frac{1}{10} \pi}^{\frac{18}{10} \pi} d y = 2\ \frac{49}{200} = \frac{49}{100}$

Kind regards

$\chi$ $\sigma$
Which agrees with the double-checked MC estimate of $0.4900 \pm 0.0003$ (2 SE).

Python Script:
Code:
import numpy as np

a = np.pi/10; b = 2.0*np.pi/10; c = 3.0*np.pi/10   # the three sector angles
N = 10000000                                        # number of Monte-Carlo trials

# the sector of angle a is fixed at [0, a); theb and thec are the random
# starting angles of the sectors of angles b and c
theb = np.random.rand(N)*2.0*np.pi
thec = np.random.rand(N)*2.0*np.pi

thed = thec - theb; thee = theb - thec

# the b- and c-sectors must clear the fixed a-sector and must not wrap past 2*pi
test1 = np.logical_and(theb > a, theb + b < 2*np.pi)
test2 = np.logical_and(thec > a, thec + c < 2*np.pi)

# either the c-sector lies entirely after the b-sector, or vice versa
test3 = np.logical_and(np.logical_and(thed > 0, thed > b), thed + c < 2*np.pi)
test4 = np.logical_and(np.logical_and(thee > 0, thee > c), thee + b < 2*np.pi)
test5 = np.logical_or(test3, test4)

TEST0 = np.logical_and(test1, test2)
TEST = np.logical_and(TEST0, test5)

PP = 1.0*np.sum(TEST)/N          # estimated probability of no overlap
SE = np.sqrt(PP*(1.0 - PP)/N)    # standard error of the estimate

print(PP, SE)

(errors in the previous MC estimate were probably due to still learning Python and misusing element-wise logical operators on numpy arrays).
 
  • #107
chisigma said:
Posted on 8 27 2013 on www.artofproblemsolving.com by the user aktyw19 and not yet solved...

Points A, B and C are randomly chosen inside a circle. A fourth point, O is chosen. What is the probability that O lies inside triangle ABC?...

The requested probability is the ratio between the mean area of the triangle ABC and the area of the circle, and clearly we can suppose that the circle is the unit circle. If $\displaystyle (x_{1},y_{1}), (x_{2},y_{2}),(x_{3},y_{3})$ are the coordinates of A, B and C, the area of the triangle is...

$\displaystyle A= \frac{1}{2}\ (x_{1} y_{2} - x_{2} y_{1} + x_{2} y_{3} - x_{3} y_{2} + x_{3} y_{1} - x_{1} y_{3})\ (1)$

... and now we pass to polar coordinates...

$\displaystyle x_{1}= r_{1}\ \cos \theta_{1},\ y_{1}= r_{1}\ \sin \theta_{1}$

$\displaystyle x_{2}= r_{2}\ \cos \theta_{2},\ y_{2}= r_{2}\ \sin \theta_{2}$

$\displaystyle x_{3}= r_{3}\ \cos \theta_{3},\ y_{3}= r_{3}\ \sin \theta_{3}$

It is no loss of generality to suppose $\displaystyle \theta_{1}=0$, so that (1) becomes...

$\displaystyle A = \frac{1}{2}\ \{r_{1} r_{2} \sin \theta_{2} + r_{2} r_{3} \sin (\theta_{3} - \theta_{2}) - r_{1} r_{3} \sin \theta_{3} \}\ (2)$

It is fully evident that the contribution of the first and third terms in brackets is 0, so that...

$\displaystyle E \{A\} = 2 \int_{0}^{1} \int_{0}^{1} \int_{0}^{2 \pi} r_{2}^{2} r_{3}^{2} (\frac{1}{\pi} - \frac{x}{2})\ \sin x\ d r_{2} d r_{3} d x = \int_{0}^{1} \int_{0}^{1} \int_{0}^{2 \pi} r_{2}^{2} r_{3}^{2} |\sin x - x \cos x|_{0}^{2 \pi} d r_{2} d r_{3} = $

$\displaystyle = \pi\ \int_{0}^{1} \int_{0}^{1} r_{2}^{2} r_{3}^{2} d r_{2} d r_{3} = \frac{\pi}{3} \int_{0}^{1} r_{2}^{2}\ d r_{2} = \frac{\pi}{9}\ (3)$

... so that the requested probability is $\displaystyle P = \frac{1}{9}$...

Kind regards

$\chi$ $\sigma$
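A Monte-Carlo sketch the reader can run to test this value; it assumes, as the solution above does, that O is uniform in the disk and independent of A, B and C:

Code:
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000

def uniform_disk(n):
    """n points uniform in the unit disk (sqrt of a uniform radius, uniform angle)."""
    r = np.sqrt(rng.random(n))
    th = 2 * np.pi * rng.random(n)
    return np.column_stack([r * np.cos(th), r * np.sin(th)])

A, B, C, O = (uniform_disk(N) for _ in range(4))

def cross(P, Q, R):
    """z-component of (Q - P) x (R - P), row by row."""
    return (Q[:, 0] - P[:, 0]) * (R[:, 1] - P[:, 1]) - (Q[:, 1] - P[:, 1]) * (R[:, 0] - P[:, 0])

# O is inside triangle ABC iff the three cross products have the same sign
s1, s2, s3 = cross(A, B, O), cross(B, C, O), cross(C, A, O)
inside = ((s1 >= 0) & (s2 >= 0) & (s3 >= 0)) | ((s1 <= 0) & (s2 <= 0) & (s3 <= 0))

print(inside.mean())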
 
  • #108
Posted on 09 26 2013 on Math Help Forum - Free Math Help Forums by the user JellyOnion and not yet solved...

John is shooting at a target. His probability of hitting the target is 0.6. What is the minimum number of shots needed for the probability of John hitting the target exactly 5 times to be more than 25%?...

Kind regards

$\chi$ $\sigma$
 
  • #109
chisigma said:
Posted on 09 26 2013 on Math Help Forum - Free Math Help Forums by the user JellyOnion and not yet solved...

John is shooting at a target. His probability of hitting the target is 0.6. What is the minimum number of shots needed for the probability of John hitting the target exactly 5 times to be more than 25%?...

What we have to do is compute the probability of at least 5 hits in n shots...

$\displaystyle n=5,\ p= (.6)^{5} = .07776 \\ n=6,\ p = (.6)^{6} + 6\ (.6)^{5}\ (.4) = .23328 \\ n=7,\ p=(.6)^{7} + 7\ (.6)^{6}\ (.4) + 21\ (.6)^{5}\ (.4)^{2} = .419904$

... so that the minimum number of shots is n=7...

Kind regards

$\chi$ $\sigma$
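A quick SciPy check of both readings of the question (exactly 5 hits, and at least 5 hits as computed above); under either reading the first n for which the probability exceeds 25% is n = 7:

Code:
from scipy.stats import binom

for n in range(5, 9):
    exactly_5  = binom.pmf(5, n, 0.6)   # P(exactly 5 hits in n shots)
    at_least_5 = binom.sf(4, n, 0.6)    # P(at least 5 hits in n shots)
    print(n, round(exactly_5, 4), round(at_least_5, 4))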
 
  • #110
Posted on 09 29 2013 on www.artofproblemsolving.com by the user tensor and not yet solved...

... three points are chosen randomly on a circle... find the probability that those points form an obtuse-angled triangle...

Kind regards

$\chi$ $\sigma$
 
  • #111
Posted on 10 02 2013 on www.mathhelpforum.com by the user Nora314 and not yet solved...

Let X and Y be independent Poisson-distributed stochastic variables, with expectation values 5 and 10, respectively. Calculate the probability P(X + Y > 10).


Kind regards

$\chi$ $\sigma$​
 
  • #112
chisigma said:
Posted on 10 02 2013 on www.mathhelpforum.com by the user Nora314 and not yet solved...

Let X and Y be independent Poisson-distributed stochastic variables, with expectation values 5 and 10, respectively. Calculate the probability P(X + Y > 10).




If X and Y are two r.v.'s, Poisson distributed with mean values $\lambda_{x}$ and $\lambda_{y}$, then...

$\displaystyle P \{ X = n \} = \frac{\lambda_{x}^{n}}{n!}\ e^{- \lambda_{x}},\ P \{ Y = n \} = \frac{\lambda_{y}^{n}}{n!}\ e^{- \lambda_{y}}\ (1)$

If Z = X + Y is another r.v., then...

$\displaystyle P \{ Z = n \} = \sum_{k =0}^{n} \frac{\lambda_{x}^{k}}{k!}\ e^{- \lambda_{x}}\ \frac{\lambda_{y}^{n-k}}{(n-k)!}\ e^{- \lambda_{y}} = e^{- (\lambda_{x} + \lambda_{y})}\ \sum_{k = 0}^{n} \frac{\lambda_{x}^{k}\ \lambda_{y}^{n-k}}{k!\ (n-k)!} = \frac{e^{- (\lambda_{x} + \lambda_{y})}}{n!} \sum_{k = 0}^{n} \binom {n}{k}\ \lambda_{x}^{k}\ \lambda_{y}^{n-k} = \frac{e^{- (\lambda_{x} + \lambda_{y})}}{n!}\ (\lambda_{x} + \lambda_{y})^{n}\ (2)$

... so that Z is also Poisson distributed with mean value $\displaystyle \lambda= \lambda_{x} + \lambda_{y}$. In our case $\lambda = 15$, so that the probability that $Z \le 10$ is...

$\displaystyle P = e^{- 15}\ \sum_{n=0}^{10} \frac{15^{n}}{n!} = .118464...\ (3)$

... and the requested probability is...

$\displaystyle 1 - P = .881535...\ (4)$

Kind regards

$\chi$ $\sigma$
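The numerical values in (3) and (4) can be checked directly with SciPy:

Code:
from scipy.stats import poisson

# Z = X + Y is Poisson with mean 15; the requested probability is P(Z > 10)
print(poisson.cdf(10, 15))   # P(Z <= 10), approx. 0.11846
print(poisson.sf(10, 15))    # P(Z > 10),  approx. 0.88154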

 
  • #113
Posted on 10 08 2013 on www.mathhelpforum.com by the user Nora314 and not yet solved...

Consider a parallel system of 2 independent components. The lifetime of each component is exponentially distributed with parameter $\lambda$. Let V be the lifetime of the system. Find the distribution of V, and E(V)...

Kind regards

$\chi$ $\sigma$
 
  • #114
chisigma said:
Posted on 10 08 2013 on www.mathhelpforum.com by the user Nora314 and not yet solved...

Consider a parallel system of 2 independent components. The lifetime of each component is exponentially distributed with parameter $\lambda$. Let V be the lifetime of the system. Find the distribution of V, and E(V)...

The p.d.f. of the lifetime T of each component is... $\displaystyle f(t) = \lambda\ e^{- \lambda\ t},\ t \ge 0\ (1)$

... so that, if $\displaystyle V= \text {max} [T_{1},T_{2}]$, the c.d.f. of V is... $\displaystyle F_{v} (v) = P \{T_{1} \le v\}\ P \{T_{2} \le v\} = (1 - e^{- \lambda\ v})^{2} = 1 - 2\ e^{- \lambda\ v} + e^{-2\ \lambda\ v}\ (2)$

The p.d.f. of V is obtained by differentiating (2)... $\displaystyle f_{v} (v) = 2\ \lambda\ (e^{- \lambda\ v} - e^{- 2\ \lambda\ v})\ (3)$

... and the expected value of V is...

$\displaystyle E \{ V \} = 2\ \lambda\ \int_{0}^{\infty} v\ (e^{- \lambda\ v} - e^{- 2\ \lambda\ v})\ dv = \frac{3}{2\ \lambda}\ (4)$

Kind regards

$\chi$ $\sigma$
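A short SymPy check of (3) and (4):

Code:
import sympy as sp

v, lam = sp.symbols('v lambda', positive=True)
F = (1 - sp.exp(-lam * v))**2              # c.d.f. of V = max(T1, T2), formula (2)
f = sp.diff(F, v)                          # p.d.f. of V, formula (3)
EV = sp.integrate(v * f, (v, 0, sp.oo))    # expected value, formula (4)
print(sp.simplify(f), sp.simplify(EV))     # EV should come out as 3/(2*lambda)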
 
  • #115
Posted on 10 22 2013 on www.artofproblemsolving.com by the user robmath and not yet solved...

A point leaps in the Euclidean plane. It starts at (0,0) and, at each hop, if X is its current position, a vector v of length 1 is chosen uniformly at random and the point jumps to the position X + v. After three jumps, what is the probability that the point is in the unit disk?...

Kind regards

$\chi$ $\sigma$
 
  • #116
chisigma said:
Posted on 10 22 2013 on www.artofproblemsolving.com by the user robmath and not yet solved...

A point leaps in the Euclidean plane. It starts at (0,0) and, at each hop, if X is its current position, a vector v of length 1 is chosen uniformly at random and the point jumps to the position X + v. After three jumps, what is the probability that the point is in the unit disk?...

If we indicate a vector of modulus 1 with $\displaystyle e^{i\ \theta}$, $\theta$ being uniformly distributed between $- \pi$ and $\pi$, then the final position is represented by...

$\displaystyle s = e^{i\ \theta_{1}} + e^{i\ \theta_{2}} + e^{i\ \theta_{3}}\ (1)$

... so that the problem consists in finding the probability that the quantity $\displaystyle |s|$ is less than 1. It is convenient to break the work into several parts, and we first sum two vectors...

$\displaystyle u = 1 + e^{i\ \theta_{1}}\ (2)$

... with $\displaystyle \theta_{1}$ uniformly distributed between $- \pi$ and $\pi$, and to find the p.d.f. of $\displaystyle X= |u|$. It is...

$\displaystyle X = \sqrt{(1 + e^{i\ \theta_{1}})\ (1 + e^{- i\ \theta_{1}})} = \sqrt{2\ (1+ \cos \theta_{1})} = 2\ \cos \frac{\theta_{1}}{2}\ (3)$

... then...

$\displaystyle P \{ X < x\} = \frac{2}{\pi}\ \int_{\cos^{-1} \frac{x}{2}}^{\frac{\pi}{2}} d \theta_{1} = 1 - \frac{2}{\pi}\ \cos^{-1} \frac{x}{2}\ (4)$

... with $\displaystyle 0 < x < 2$ and the p.d.f. of X is...

$\displaystyle f_{X} (x) = \frac{1}{\pi\ \sqrt {1 - \frac{x^{2}}{4}}}\ (5)$

As a second step we consider the r.v. Y defined as...

$\displaystyle Y = |X + e^{i\ \theta_{2}}|^{2} = (X + e^{i\ \theta_{2}})\ (X + e^{- i\ \theta_{2}}) = X^{2} +2\ X\ \cos \theta_{2} + 1\ (6)$

Now what we have to do is to evaluate the probability $\displaystyle P \{Y < 1\}$ and, observing (6), we realize that it is equal to the probability $\displaystyle P \{X < -2\ \cos \theta_{2}\}$, that is...

$\displaystyle P = \int \int_{A} f_{X} (x)\ f_{Y} (y)\ dy\ dx\ (7)$

... where $f_{X} (x)$ is given by (5) and $f_{Y} (y)$, here the p.d.f. of the r.v. $2\ \cos \theta_{2}$, is given by...

$\displaystyle f_{Y} (y) = \frac{1}{2\ \pi \sqrt{1 - \frac{y^{2}}{4}}}\ (8)$

... and A is the region coloured in yellow in the figure... http://www.123homepage.it/u/i78946596._szw380h285_.jpg.jfif

Explicit computation of P is...

$\displaystyle P = \frac{1}{2\ \pi^{2}}\ \int_{0}^{2}\ \int_{-2}^{- x}\ \frac{d y\ d x}{\sqrt{1 - \frac{x^{2}}{4}}\ \sqrt{1- \frac{y^{2}}{4}}} = \frac{1}{2\ \pi^{2}}\ \int_{0}^{2} \frac{\pi - 2\ \sin^{-1} \frac{x}{2}}{\sqrt{1- \frac{x^{2}}{4}}}\ dx = \frac{\pi^{2}}{4\ \pi^{2}} = \frac{1}{4}\ (9)$

Kind regards

$\chi$ $\sigma$
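A Monte-Carlo cross-check of the value 1/4 (a minimal sketch):

Code:
import numpy as np

rng = np.random.default_rng(0)
N = 2_000_000

# three independent unit steps with uniformly random directions
theta = rng.uniform(-np.pi, np.pi, size=(N, 3))
s = np.exp(1j * theta).sum(axis=1)      # final position as a complex number

print(np.mean(np.abs(s) < 1))           # should be close to 0.25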
 
  • #117
Posted on 11 06 2013 on www.artofproblemsolving.com by the user MANMAID and not yet solved...

Suppose two teams play a series of games, each producing a winner and loser, until one team has won two more games than the other. Let G be the total number of games played. Assume each team has a chance of 0.5 to win each game, independent of the results of the previous games.

Find the probability distribution of G.
Find the expected value of G.


Kind regards

$\chi$ $\sigma$
 
  • #118
chisigma said:
Posted on 11 06 2013 on www.artofproblemsolving.com by the user MANMAID and not yet solved...

Suppose two teams play a series of games, each producing a winner and loser, until one team has won two more games than the other. Let G be the total number of games played. Assume each team has a chance of 0.5 to win each game, independent of the results of the previous games.

Find the probability distribution of G.
Find the expected value of G.

In a slightly different form, this is the most classical of the problems concerning Markov chains. The number of states is five and we can call them simply 0, 1, 2, 3, 4. The initial state is 0 and the two absorbing states are 3 and 4. The probability transition matrix written in 'canonical form' is...

$\displaystyle P = \left | \begin{matrix} 0 & \frac{1}{2} & \frac{1}{2} & 0 & 0 \\ \frac{1}{2} & 0 & 0 & \frac{1}{2} & 0 \\ \frac{1}{2} & 0 & 0 & 0 & \frac{1}{2} \\ 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{matrix} \right| $ (1)

Setting $\displaystyle g(n) = P \{ G=n\}$, it is easy to see that $g(n) = 0$ for $n$ odd. For $n$ even, setting $\displaystyle s(n) = a_{0,3} + a_{0,4}$, the sum of the two rightmost elements of row 0 of the matrix $P^{n}$, we have $g(n)= s(n)- s(n-2)$. Proceeding we have...

$\displaystyle s(0) = 0\ -> g(0)=0$

$\displaystyle s(2) = \frac{1}{2}\ -> g(2)= \frac{1}{2}$

$\displaystyle s(4) = \frac{3}{4}\ -> g(4)= \frac{1}{4}$

$\displaystyle s(6) = \frac{7}{8}\ -> g(6)= \frac{1}{8}$

$\displaystyle s(8) = \frac{15}{16}\ -> g(8)= \frac{1}{16}$

... and it is fully evident that $\displaystyle g(2\ n)= \frac{1}{2^{n}}$. The expected value of G is therefore...

$\displaystyle E \{ G \} = 2\ \sum_{n=1}^{\infty} \frac {n}{2^{n}} = 4\ (2)$

Kind regards

$\chi$ $\sigma$
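A short numerical confirmation, iterating the chain (1) instead of forming matrix powers symbolically:

Code:
import numpy as np

P = np.array([[0., .5, .5, 0., 0.],
              [.5, 0., 0., .5, 0.],
              [.5, 0., 0., 0., .5],
              [0., 0., 0., 1., 0.],
              [0., 0., 0., 0., 1.]])

p = np.array([1., 0., 0., 0., 0.])     # start in state 0
absorbed_prev, EG = 0.0, 0.0
for n in range(1, 41):                 # 40 steps is plenty: the tail is geometric
    p = p @ P
    absorbed = p[3] + p[4]             # s(n) in the notation above
    g = absorbed - absorbed_prev       # g(n) = P{G = n}
    if g > 0 and n <= 8:
        print(n, g)                    # 2 0.5, 4 0.25, 6 0.125, 8 0.0625
    EG += n * g
    absorbed_prev = absorbed

print("E{G} =", EG)                    # approx. 4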
 
  • #119
Posted on 11 25 2013 on www.artofproblemsolving.com by the user erbed and not yet solved...

John has n dollars and he starts flipping a fair coin. Each time the result is a head he wins one dollar, otherwise he loses one dollar. This 'game' ends when he reaches the amount of m > n dollars or when he loses all his money. What is the probability that he wins this game?... [Give the answer in terms of n and m...]

Kind regards

$\chi$ $\sigma$
 
  • #120
Posted on 11 27 2013 on www.talkstats.com by the user TrueTears and not yet solved...

Let Z = X + Y where $\displaystyle X \sim N (\mu,\sigma^2)$ and $\displaystyle Y \sim \Gamma (k, \theta)$. Also assume X and Y are independent. Then what is the distribution (pdf) of Z?...

Kind regards

$\chi$ $\sigma$
 
