I don't understand how it is possible to show, using Minkowski's inequality, that
(\sum_i x_i)^a \leq \sum_i x_i^a, where x_i \geq 0 for all i and 0 < a < 1.
I also tried to prove this without Minkowski, but to no avail.
This is driving me crazy, although it seems to be trivial in...
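While I keep looking for a proof, a quick numerical sanity check of the inequality is easy to run (this is of course not a proof; the ranges for the x_i and for a below are just arbitrary choices):

```python
# Numerical sanity check (not a proof) of (sum x_i)^a <= sum x_i^a
# for nonnegative x_i and 0 < a < 1.
import random

random.seed(0)
for _ in range(1000):
    xs = [random.uniform(0, 10) for _ in range(random.randint(1, 8))]
    a = random.uniform(0.01, 0.99)
    lhs = sum(xs) ** a
    rhs = sum(x ** a for x in xs)
    assert lhs <= rhs + 1e-9, (xs, a)
print("inequality held in all trials")
```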
Suppose f: R -> R is integrable
Then, is F, the indefinite integral of f, a continuous function?
If this is not always true, what conditions do we need?
I know that if f is continuous, F is also continuous. What if f is a step function?
Can you think of any other interesting cases?
I'm...
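For the step-function case, here is a small Python sketch. The particular step function (f = 1 on [0,1), 0 elsewhere) is just an assumed example; its indefinite integral F(x) = \int_0^x f has a simple closed form, and one can check numerically that F stays continuous even where f jumps:

```python
# f is the step function: 1 on [0, 1), 0 elsewhere.
# Its indefinite integral F(x) = integral of f from 0 to x, in closed form:
def F(x):
    if x < 0:
        return 0.0      # f vanishes on the negatives
    return min(x, 1.0)  # ramps up on [0, 1], then stays at 1

# F is continuous at x = 1, even though f jumps there:
eps = 1e-8
assert abs(F(1 + eps) - F(1 - eps)) < 1e-6
print("no jump in F at x = 1")
```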
Let x>0 be a random variable with some distribution with finite mean and let E denote the expectation with respect to that distribution.
By Jensen's inequality we have E[log(x)] <= log E(x) < +inf.
But does this imply that -inf < E[log(x)] too? Or is it possible that E[log(x)] = -inf?
Sorry if my...
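For what it's worth, here is a sketch of one standard construction (my own choice of distribution, so double-check it) suggesting that E[log(x)] = -inf can happen even with a finite mean:

```latex
% Sketch: a positive r.v. with finite mean but E[log x] = -infinity.
% Take P(x = e^{-2^n}) = 2^{-n} for n = 1, 2, 3, \ldots (probabilities sum to 1).
\[
  \mathbb{E}[x] = \sum_{n \ge 1} 2^{-n} e^{-2^n} < \infty,
  \qquad
  \mathbb{E}[\log x] = \sum_{n \ge 1} 2^{-n} \left(-2^n\right)
                     = -\sum_{n \ge 1} 1 = -\infty.
\]
```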
Let 0<β<1
So, β^n -> 0 as n -> infty
Also, we can find γ>1 so that
γ^n * β^n -> 0 as n -> infty
e.g. γ = β^{-(1/2)}
My question is: how can I show that
n * β^n -> 0 as n -> infty,
and that there exists γ>1 so that
γ^n * n * β^n -> 0 as n -> infty?
I appreciate any help
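As a numerical illustration (again, not a proof; the value β = 0.9 is just a sample choice), the same γ = β^{-1/2} appears to work here too, since γ^n * n * β^n = n * β^{n/2}:

```python
# Illustration (not a proof): n * beta^n -> 0, and with
# gamma = beta**-0.5 > 1, gamma^n * n * beta^n = n * beta^(n/2) -> 0 as well.
beta = 0.9
gamma = beta ** -0.5
for n in (10, 100, 1000):
    print(n, n * beta ** n, gamma ** n * n * beta ** n)

# Both products are tiny by n = 1000:
assert 1000 * beta ** 1000 < 1e-40
assert gamma ** 1000 * 1000 * beta ** 1000 < 1e-19
```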
Yes, I know this is the definition of P(A U B), but does it imply anything about the sums when we have inequalities?
Also, I made some changes to my original post. Can you take a look again?
If x1, x2 are positive random variables and we have the following two events:
A={x1 > δ}
B={x2> k-δ}
where 0<δ<k
then is it true that
P(A U B) = P(x1 + x2 > δ + (k-δ) = k)?
If true, can you explain why that is?
Thank you
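A quick Monte Carlo sanity check may help frame the question (the exponential distributions and the values of k and δ below are purely my assumptions). Pointwise, x1 + x2 > k forces x1 > δ or x2 > k - δ, which only gives an inclusion of events, so one would expect P(A U B) >= P(x1 + x2 > k) rather than equality:

```python
# Monte Carlo sanity check (not a proof), assuming x1, x2 ~ Exp(1):
# whenever x1 + x2 > k, at least one of x1 > delta, x2 > k - delta holds,
# so {x1 + x2 > k} is contained in A u B.
import random

random.seed(1)
k, delta = 2.0, 0.5
n = 100_000
union = sum_exceeds = 0
for _ in range(n):
    x1 = random.expovariate(1.0)
    x2 = random.expovariate(1.0)
    in_union = (x1 > delta) or (x2 > k - delta)
    if x1 + x2 > k:
        assert in_union  # the inclusion {x1 + x2 > k} <= A u B
        sum_exceeds += 1
    union += in_union
print("P(A u B) approx", union / n, " P(x1+x2>k) approx", sum_exceeds / n)
```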