Want to understand a concept here about dimensions of a function.
Using example 1: a simple Fourier series from http://en.wikipedia.org/wiki/Fourier_series
s(x) = \frac{a_0}{2} + \sum_{n=1}^{\infty}\left[a_n \cos(nx) + b_n \sin(nx)\right]
So do we now say that s(x) has an infinite dimensional...
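One way to make the "infinite-dimensional" idea concrete: each coefficient pair (a_n, b_n) is the coordinate of s along one basis function, and there are countably many of them. A quick numerical sketch, using f(x) = x on (-\pi, \pi) as my own illustrative example (not from the post):

```python
import numpy as np

# Fourier coefficients as "coordinates": each b_n below is the component
# of f along the basis function sin(nx).  f(x) = x is an illustrative
# choice of mine (an odd function, so all the a_n vanish).

def trapezoid(y, x):
    """Plain trapezoid rule (avoids depending on a particular numpy API)."""
    dx = np.diff(x)
    return float(np.sum(dx * (y[:-1] + y[1:]) / 2))

x = np.linspace(-np.pi, np.pi, 20001)
f = x  # the function being expanded

def b_coeff(n):
    return trapezoid(f * np.sin(n * x), x) / np.pi

# Known closed form for f(x) = x: b_n = 2*(-1)**(n+1)/n.
for n in range(1, 5):
    print(n, b_coeff(n), 2 * (-1) ** (n + 1) / n)
```

Each b_n is one coordinate; describing f exactly takes infinitely many of them, which is the sense in which the function space is infinite dimensional.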
so for "f_0, the probability that 0 of the X_i are added", and assuming f_0 = {n \choose 0}p^0(1-p)^{n} = (1-p)^n
and so on, then what do I do with all the f0, f1, ...
Is the average then like a weighted average?
I'm pretty lost, so any help is greatly appreciated. Thanks.
What kind of problem is this?
X_i \textrm{ are iid with known mean and variance, } \mu \textrm{ and } \sigma^2 \textrm{ respectively.}
m \sim \textrm{Binomial(n,p), n is known.}
S = \sum^{m}_{i=1} X_i
How do I work with this? This is what I have thought of.
S = \sum^{m}_{i=1} X_i =...
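For what it's worth, when m is independent of the X_i, Wald's identity gives E[S] = E[m]\mu = np\mu, and the law of total variance gives Var(S) = E[m]\sigma^2 + Var(m)\mu^2. A Monte Carlo sketch of both facts (the concrete values of n, p, \mu, \sigma and the choice X_i \sim Normal are my own assumptions for illustration; the post only says the X_i are iid with known mean and variance):

```python
import numpy as np

# Monte Carlo sketch of the random sum S = sum_{i=1}^{m} X_i, with
# m ~ Binomial(n, p) independent of the X_i.  The numeric values and
# the Normal choice for X_i are illustrative assumptions of mine.
rng = np.random.default_rng(0)
n, p, mu, sigma = 20, 0.3, 2.0, 1.5
trials = 100_000

m = rng.binomial(n, p, size=trials)
S = np.array([rng.normal(mu, sigma, size=k).sum() for k in m])

# Wald's identity:        E[S]   = E[m] * mu = n*p*mu
# Law of total variance:  Var(S) = E[m]*sigma^2 + Var(m)*mu^2
print(S.mean(), n * p * mu)
print(S.var(), n * p * sigma**2 + n * p * (1 - p) * mu**2)
```

The simulated mean and variance should land close to the two closed-form expressions, regardless of which iid distribution the X_i actually have.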
Is there a reference you could point me to or a reason why?
Would it be true both for
\mu > \hat{\mu}
and for
\mu < \hat{\mu}?
E[|\mu - \hat{\mu}|] = \frac{2}{\sqrt{\pi}}
by going through and doing the actual integration.
\frac{d}{d\sigma^2} \left[ \log(\sigma^2) - \frac{1}{\sigma^2} \right]
I think the first part is
\frac{1}{\sigma ^ 2}
but I'm pretty clueless after that. I also want to take the second derivative.
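Treating \sigma^2 as a single variable makes this mechanical; writing u = \sigma^2, a sketch of both derivatives:

```latex
% Substitute u = \sigma^2 and differentiate with respect to u:
\frac{d}{du}\left[\log u - \frac{1}{u}\right]
    = \frac{1}{u} + \frac{1}{u^2}
    = \frac{1}{\sigma^2} + \frac{1}{\sigma^4},
\qquad
\frac{d^2}{du^2}\left[\log u - \frac{1}{u}\right]
    = -\frac{1}{u^2} - \frac{2}{u^3}
    = -\frac{1}{\sigma^4} - \frac{2}{\sigma^6}.
```

So the 1/\sigma^2 guess for the first term is right; the -1/\sigma^2 term then contributes an extra +1/\sigma^4.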
Any help or a reference would be great.
Thanks!
Homework Statement
We have A \in \mathbb{R}^{m \times m} \text{ and } b \in \mathbb{R}^{m} \text{ and } b \neq 0 \text{. Show that } Ax = b \text{ and } A(x + \delta x) = b + \delta b
The Attempt at a Solution
I did the first part just by the definition of A being non singular. The second part is tripping...
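The statement above is cut off, but exercises of this shape usually ask for the condition-number bound \frac{\|\delta x\|}{\|x\|} \leq \kappa(A) \frac{\|\delta b\|}{\|b\|}. Assuming that is the target (my guess, since the full statement is missing), here is a quick numerical check on an arbitrary example:

```python
import numpy as np

# Numerical check of the bound ||dx||/||x|| <= cond(A) * ||db||/||b||,
# which is what this kind of exercise typically asks for.  The matrix,
# right-hand side, and perturbation below are arbitrary choices of mine.
rng = np.random.default_rng(1)
m = 5
A = rng.standard_normal((m, m)) + 5 * np.eye(m)  # comfortably nonsingular
b = rng.standard_normal(m)
db = 1e-6 * rng.standard_normal(m)

x = np.linalg.solve(A, b)
dx = np.linalg.solve(A, b + db) - x  # so that A(x + dx) = b + db

lhs = np.linalg.norm(dx) / np.linalg.norm(x)
rhs = np.linalg.cond(A) * np.linalg.norm(db) / np.linalg.norm(b)
print(lhs, rhs, lhs <= rhs)
```

The proof mirrors the check: \delta x = A^{-1}\delta b gives \|\delta x\| \leq \|A^{-1}\|\,\|\delta b\|, and b = Ax gives \|b\| \leq \|A\|\,\|x\|; combining the two yields the bound.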
Makes sense it should be a function of w only. I do not understand though how the integral with the minimum is broken up into the two integrals at the end. Any insight?
Thanks for the help.
Homework Statement
Let x,y be iid and x, y \sim U(0,1) (uniform on the open set (0,1)) and let z = xy^2.
Find the density of z.
Homework Equations
The Attempt at a Solution
P(z \leq w) = P(xy^2 \leq w) = P(- \sqrt{\frac{w}{x}} \leq y \leq \sqrt{\frac{w}{x}}) = \int^{...
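Two small things that may help: since y \in (0,1), the lower limit of the y-event is 0 rather than -\sqrt{w/x}, and the x-integral has to be split at the point where \sqrt{w/x} would exceed 1. Carrying that through gives P(z \leq w) = 2\sqrt{w} - w for 0 < w < 1 (so the density is 1/\sqrt{w} - 1), which a quick Monte Carlo can sanity-check:

```python
import numpy as np

# Monte Carlo sanity check for the CDF of z = x*y^2, x, y iid U(0,1).
# Doing the conditioning integral by hand gives
#   P(z <= w) = 2*sqrt(w) - w,   0 < w < 1,
# and the check below compares that formula against simulation.
rng = np.random.default_rng(2)
x = rng.uniform(size=1_000_000)
y = rng.uniform(size=1_000_000)
z = x * y**2

for w in (0.1, 0.25, 0.5):
    empirical = (z <= w).mean()
    analytic = 2 * np.sqrt(w) - w
    print(w, empirical, analytic)
```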
Homework Statement
Z is a 2x1 multivariate Gaussian random vector, Z = (X, Y)^t with X, Y real, mean zero, and covariance matrix
\Gamma = \begin{pmatrix} 1 & \alpha \\ \alpha & 1 \end{pmatrix}...
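A quick sampling sketch of this setup, using \alpha = 0.6 as an arbitrary numeric stand-in (the post keeps \alpha symbolic):

```python
import numpy as np

# Sample the 2x1 Gaussian vector Z = (X, Y)^t with mean zero and
# covariance [[1, alpha], [alpha, 1]], then check the empirical
# covariance.  alpha = 0.6 is an illustrative value of mine.
alpha = 0.6
Gamma = np.array([[1.0, alpha], [alpha, 1.0]])

rng = np.random.default_rng(3)
Z = rng.multivariate_normal(mean=[0.0, 0.0], cov=Gamma, size=500_000)

print(np.cov(Z.T))  # should be close to Gamma
```

Since the diagonal entries are 1, \alpha is both the covariance and the correlation of X and Y, which is why the off-diagonal of the empirical covariance should sit near \alpha.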
They are the same X and Y values.
1 and 2 are totally separate. There is another question about conditional expectation that goes with 2, but I think I can get that part with a little help on 2.
Thanks for the help.
Awesome, thanks for the help. Now where might a good reference be for my second problem?
Is this thought process right?
1. Integrate Y to get the marginal of X to get F
2. Do convolution to get distribution of G = X+Y
3. How do I recover the joint distribution from the marginals?
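Step 2 can be sketched numerically; the snippet below convolves two U(0,1) densities as a concrete stand-in (the actual marginals aren't specified above), recovering the triangular density of the sum on (0, 2). On step 3: marginals alone don't pin down a joint distribution in general, so some extra structure such as independence is needed.

```python
import numpy as np

# Sketch of step 2: density of G = X + Y by numerically convolving the
# two marginal densities.  X, Y ~ U(0,1) is my own concrete stand-in;
# the resulting density is the triangle on (0, 2) peaking at s = 1.
dx = 0.001
grid = np.arange(0.0, 1.0, dx)
f = np.ones_like(grid)        # U(0,1) density sampled on [0, 1)

g = np.convolve(f, f) * dx    # density of X + Y, sampled on [0, 2)
s = np.arange(len(g)) * dx

print(np.interp(1.0, s, g))   # triangle peak: density near 1 at s = 1
```

Note the convolution step is only valid when X and Y are independent; that is exactly the structure that lets you pass between marginals and the joint in step 3.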