Recent content by St41n

  1. Is it possible to perform supremum with two parameters in the same function?

     Can I do this: \sup_{(x,y)} f(x,y) = \sup_y \sup_x f(x,y)? (A short sketch is appended after this list.)
  2. Distribution with no fractional moments?

     I actually meant E|X|^p < ∞, to include RVs on R.
  3. Distribution with no fractional moments?

     Is it possible that for a random variable X there exists no p in (0,1) such that EX^p < ∞? Is there any example? (One candidate example is sketched after this list.)
  4. Can Minkowski's Inequality Prove Summation Inequality for Positive Numbers?

    Thank you very much for the quick reply!
  5. Can Minkowski's Inequality Prove Summation Inequality for Positive Numbers?

     I don't understand how it is possible to show, using Minkowski's inequality, that (\sum x_i)^a \leq \sum x_i^a where x_i \geq 0 \forall i and 0 < a < 1. I also tried to prove this without using Minkowski, but to no avail. This is driving me crazy although it seems to be trivial in... (A direct argument is sketched after this list.)
  6. Is the Indefinite Integral of a Riemann Integrable Function Always Continuous?

    So the integral of a Riemann integrable function is continuous. Thanks!
  7. Is the Indefinite Integral of a Riemann Integrable Function Always Continuous?

     Suppose f: R -> R is integrable. Then is F, the indefinite integral of f, a continuous function? If this is not always true, what conditions do we need? I know that if f is continuous, F is also continuous. What if f is a step function? Can you think of any other interesting cases? I'm... (A sketch is appended after this list.)
  8. Different positive/negative error values

    What do you mean? Why can't you use standard deviation?
  9. Can Elog(x) Be Infinite for Some Distributions?

     Let x > 0 be a random variable with some distribution with finite mean, and let E denote the expectation with respect to that distribution. By Jensen's inequality we have Elog(x) ≤ logE(x) < +∞. But does this imply that -∞ < Elog(x) too? Or is it possible that Elog(x) = -∞? Sorry if my... (An example is sketched after this list.)
  10. Showing Convergence: 0<β<1, n*β^n -> 0

     Let 0 < β < 1. So β^n -> 0 as n -> ∞. Also, we can find γ > 1 so that γ^n · β^n -> 0 as n -> ∞, e.g. γ = β^{-1/2}. My question is how I can show that n·β^n -> 0 as n -> ∞, and that there exists γ > 1 so that γ^n · n·β^n -> 0 as n -> ∞. I appreciate any help. (A sketch is appended after this list.)
  11. Question about probability of union

     Yes, I know this is the definition of P(A U B), but does it imply anything about the sums when we have inequalities? Also, I made some changes to my original post. Can you take a look again?
  12. Question about probability of union

     If x1, x2 are positive random variables and we have the following two events: A = {x1 > δ}, B = {x2 > k-δ}, where 0 < δ < k, then is it true that P(A U B) = P(x1 + x2 > δ + (k-δ) = k)? If true, can you explain why that is? Thank you. (A partial observation is sketched after this list.)
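
On the supremum question (item 1): a brief sketch, assuming f is an extended-real-valued function on a product set. Both orders of taking suprema bound the same set of values:

\sup_{(x,y)} f(x,y) \leq \sup_y \sup_x f(x,y), since for every (x,y) we have f(x,y) \leq \sup_x f(x,y) \leq \sup_y \sup_x f(x,y);
\sup_y \sup_x f(x,y) \leq \sup_{(x,y)} f(x,y), since for every fixed y, \sup_x f(x,y) \leq \sup_{(x,y)} f(x,y).

So the joint supremum equals the iterated one (and, by symmetry, also \sup_x \sup_y f(x,y)).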
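
On the fractional-moments question (items 2-3): a sketch of one candidate example. Take C standard Cauchy and set X = e^{|C|}. For every p > 0,

E|X|^p = E e^{p|C|} = \int_{-\infty}^{\infty} e^{p|c|} \, \frac{1}{\pi(1+c^2)} \, dc = +\infty,

because the integrand grows exponentially while the Cauchy density decays only like 1/c^2. So no p \in (0,1) (indeed no p > 0) gives a finite p-th absolute moment.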
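
On the summation inequality (item 5): a direct argument that does not go through Minkowski, offered as one possible route. For x, y \geq 0 with x + y > 0, the ratios x/(x+y) and y/(x+y) lie in [0,1], and t \leq t^a for t \in [0,1] when 0 < a < 1, so

1 = \frac{x}{x+y} + \frac{y}{x+y} \leq \left(\frac{x}{x+y}\right)^a + \left(\frac{y}{x+y}\right)^a.

Multiplying by (x+y)^a gives (x+y)^a \leq x^a + y^a, and induction on the number of terms yields (\sum_i x_i)^a \leq \sum_i x_i^a.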
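
On the indefinite integral (items 6-7): a sketch assuming f is Riemann integrable on [a,b], hence bounded, say |f| \leq M, with F(x) = \int_a^x f(t)\,dt. For any x, y in [a,b],

|F(x) - F(y)| = \left| \int_y^x f(t)\,dt \right| \leq M |x - y|,

so F is Lipschitz and therefore continuous, step functions included. Differentiability of F, by contrast, can fail at the discontinuities of f.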
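
On the Elog(x) question (item 9): Jensen only gives the upper bound, and the lower one can indeed fail. A sketch of one example: take C standard Cauchy and x = e^{-|C|}. Then 0 < x \leq 1, so Ex \leq 1 < \infty, but

E\log(x) = -E|C| = -\infty,

because the Cauchy distribution has no finite first absolute moment. So Elog(x) = -∞ is possible even when the mean of x is finite.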
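
On the convergence question (item 10): one standard sketch. Write \beta = 1/(1+h) with h > 0. By the binomial theorem, (1+h)^n \geq \binom{n}{2} h^2 = \frac{n(n-1)}{2} h^2 for n \geq 2, hence

n\beta^n = \frac{n}{(1+h)^n} \leq \frac{2}{(n-1)h^2} \to 0.

For the second part, take \gamma = \beta^{-1/2} > 1: then \gamma^n \cdot n\beta^n = n(\sqrt{\beta})^n, and since 0 < \sqrt{\beta} < 1 the same argument applies with \sqrt{\beta} in place of \beta.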
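
On the union question (items 11-12): a partial observation rather than a full answer. In general only an inclusion holds: if x_1 \leq \delta and x_2 \leq k-\delta, then x_1 + x_2 \leq k, so

\{x_1 + x_2 > k\} \subseteq \{x_1 > \delta\} \cup \{x_2 > k-\delta\}, hence P(x_1 + x_2 > k) \leq P(A \cup B).

Equality can fail, for example when x_1 is a constant slightly larger than \delta and x_2 a constant much smaller than k - \delta: then P(A \cup B) = 1 while P(x_1 + x_2 > k) = 0.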