Can E[log(x)] Be Infinite for Some Distributions?

  • Thread starter: St41n
  • Start date
  • Tags: Finite
AI Thread Summary
For a random variable x > 0 with a finite mean, Jensen's inequality gives E(log(x)) ≤ log(E(x)) < +∞, so E(log(x)) is bounded above. The discussion asks whether E(log(x)) can nevertheless equal -∞, since log(x) tends to -∞ as x approaches zero. Although log(x) is finite for every x > 0, the expectation can still diverge: for x = exp(-1/u) with u uniformly distributed on (0,1), E(x) is finite while E(log(x)) = -∞. So Jensen's inequality only rules out +∞, not -∞.
St41n
Let x > 0 be a random variable with some distribution with a finite mean, and let E denote the expectation with respect to that distribution.
By Jensen's inequality we have ##E[\log(x)] \le \log E[x] < +\infty##.

But does this imply that ##-\infty < E[\log(x)]## too? Or is it possible that ##E[\log(x)] = -\infty##?

Sorry if my question is stupid. Thanks in advance.
 
If x > 0 then log(x) is always finite, and so E(log(x)) must be finite.
 
But when x -> 0, log(x) -> -inf.
 
Try x = exp(-1/u) where u is uniform on (0,1).
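To spell out why this works: with ##u## uniform on ##(0,1)## and ##x = e^{-1/u}##, we have ##\log x = -1/u## and ##e^{-1/u} \le 1## on ##(0,1)##, so
$$E[x] = \int_0^1 e^{-1/u}\,du \le 1 < \infty, \qquad E[\log x] = -\int_0^1 \frac{du}{u} = -\infty.$$
The mean of ##x## is finite, yet ##E[\log x] = -\infty##: Jensen's inequality only bounds ##E[\log x]## from above, and the answer to the original question is that ##E[\log x] = -\infty## is indeed possible.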
 
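For anyone who wants a numerical feel for the example above, here is a minimal NumPy sketch (the seed and sample sizes are arbitrary illustrative choices): the sample mean of ##x## settles near a finite value, while the sample mean of ##\log x = -1/u## keeps drifting downward as the sample grows.

```python
import numpy as np

# Monte Carlo look at x = exp(-1/u) with u ~ Uniform(0,1):
# mean(x) stabilises near a finite value, while mean(log x) = mean(-1/u)
# keeps drifting downward as n grows, consistent with E[x] < inf
# and E[log x] = -inf.
rng = np.random.default_rng(0)

for n in (10**3, 10**5, 10**7):
    u = rng.uniform(size=n)
    x = np.exp(-1.0 / u)
    log_x = -1.0 / u  # log(x) computed directly, avoiding underflow in exp(-1/u)
    print(f"n={n:>9}  mean(x)={x.mean():.4f}  mean(log x)={log_x.mean():.2f}")
```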