First of all, I will use $L$ to denote $\lambda$, the parameter of the distribution.
$X\sim\text{Poisson}(nL)$, $n\in\Bbb{N}$,
$Y\sim\text{Poisson}(mL)$, $m\in\Bbb{N}$ with $m\ne n$,
$S = aX + bY$, with $a, b$ real constants.
Given observations $x$ and $y$, find the maximum likelihood estimator of $L$.
The problem is that I don't know what the pmf...
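For what it's worth, here is a sketch of the derivation, assuming $X$ and $Y$ are independent and that $x$ and $y$ are observed directly (so $S$ does not enter the likelihood). The joint likelihood is
$$\mathcal{L}(L) = e^{-nL}\frac{(nL)^x}{x!}\cdot e^{-mL}\frac{(mL)^y}{y!},$$
so the log-likelihood is $\ell(L) = -(n+m)L + (x+y)\ln L + \text{const}$. Setting $\ell'(L) = -(n+m) + \frac{x+y}{L} = 0$ gives
$$\hat{L} = \frac{x+y}{n+m}.$$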
Thanks for the reply.
If it's just induction I think I will just ignore it; I thought I was missing some property of ${n \choose r}$. (I never really liked induction.)
Repeatedly apply $\binom{n}{r}= \binom{n-1}{r}+\binom{n-1}{r-1}$ to show:
$$\binom{n}{r}=\sum_{i=1}^{r+1}\binom{n-i}{r-i+1}$$
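One way to see how the repeated application plays out (my reading of the hint): expanding the lower term at each step,
$$\binom{n}{r}=\binom{n-1}{r}+\binom{n-1}{r-1}=\binom{n-1}{r}+\binom{n-2}{r-1}+\binom{n-2}{r-2}=\cdots,$$
and after $r$ steps the leftover term is $\binom{n-r}{0}=1=\binom{n-r-1}{0}$, which is exactly the $i=r+1$ summand. For example, with $n=5$, $r=2$: $\binom{4}{2}+\binom{3}{1}+\binom{2}{0}=6+3+1=10=\binom{5}{2}$.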
The closest I got was showing that you could work out successive iterations of the binomial coefficients (Pascal's Triangle).
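In case a quick numerical sanity check is useful, here is a minimal sketch in Python (the function name `rhs` is just mine; it relies on `math.comb`, available in Python 3.8+):

```python
import math  # math.comb requires Python 3.8+

def rhs(n, r):
    # Right-hand side of the identity: sum_{i=1}^{r+1} C(n-i, r-i+1)
    return sum(math.comb(n - i, r - i + 1) for i in range(1, r + 2))

# Check the identity C(n, r) == rhs(n, r) for a range of small values
for n in range(2, 12):
    for r in range(1, n):
        assert math.comb(n, r) == rhs(n, r), (n, r)

print("Identity verified for all tested (n, r) pairs")
```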
Q: Explain why there can be no random variable for which $M_x(t)=\frac{t}{t-1}$.
($M_x(t)$ is the moment-generating function of the random variable $x$.)
A: I tried differentiating it twice and got mean of $x=1$ and variance of $x=1$, which seems fine. Maybe it's because $M_x(0)$ is not equal to...
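For reference, a short check along those lines: for any random variable, $M_x(0)=E[e^{0\cdot x}]=E[1]=1$, but here $M_x(0)=\frac{0}{0-1}=0\ne 1$, so $\frac{t}{t-1}$ cannot be a moment-generating function. (It is also negative for $0<t<1$, whereas an MGF, being the expectation of a positive quantity, is always positive.)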
Thanks for that, I sometimes make silly mistakes like that when I get tired.
Also, is the maths formatting used here the same as on most sites (showing the integral sign, etc.)? I'm just not sure if I want to learn it just for this site.
Q: The amount of time (in minutes) that an executive of a certain firm talks on the telephone is a random variable having the probability density:
$$f(x) = \begin{cases} \dfrac{x}{4}&\text{for $0 < x \le 2$}\\
\dfrac{4}{x^3}&\text{for $x > 2$}\\
0&\text{elsewhere}\end{cases}$$
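As a quick sanity check, the given pieces do integrate to one, so this is a valid density:
$$\int_0^2 \frac{x}{4}\,dx + \int_2^\infty \frac{4}{x^3}\,dx = \left[\frac{x^2}{8}\right]_0^2 + \left[-\frac{2}{x^2}\right]_2^\infty = \frac{1}{2} + \frac{1}{2} = 1.$$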