Standard deviation and Bernoulli distribution

Discussion Overview

The discussion revolves around the properties of the standard deviation and variance in the context of a Bernoulli distribution. Participants explore whether the relationship between standard deviation and variance affects the applicability of statistical theorems, particularly Chebyshev's inequality.

Discussion Character

  • Technical explanation, Debate/contested

Main Points Raised

  • One participant states that for a Bernoulli distribution, the standard deviation is greater than the variance due to the relationship pq < 1.
  • Another participant questions the concern raised, asking for clarification on why this relationship would matter.
  • A different participant argues that the property of the standard deviation being greater than the variance is not unique to the Bernoulli distribution and does not imply any restrictions on the use of statistical theorems.
  • Another reply reiterates the initial question regarding the usability of standard deviation-based theorems, suggesting that the theorem itself outlines its conditions for applicability.

Areas of Agreement / Disagreement

Participants express differing views on whether the relationship between standard deviation and variance impacts the usability of statistical theorems, indicating that the discussion remains unresolved.

Contextual Notes

Participants have not reached a consensus on the implications of the relationship between standard deviation and variance for the applicability of Chebyshev's inequality and similar theorems.

Kocur
Let us assume that X has a Bernoulli distribution, with P(X = 1) = p and P(X = 0) = q = 1 - p. Of course, E(X) = p and Var(X) = pq. Now, since pq < 1, the standard deviation sqrt(pq) is greater than the variance pq.

I have got the following question:

Does this fact make standard deviations and theorems based on standard deviation (like Chebyshev's inequality) unusable in this case?
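The observation in the question can be checked numerically. The following sketch (my own illustration, not part of the thread) confirms that for every p in (0, 1) the variance pq is at most 1/4, so the standard deviation sqrt(pq) always exceeds it:

```python
import math

# Bernoulli(p): E[X] = p, Var(X) = p*q with q = 1 - p.
# Since 0 < p*q <= 1/4 < 1, the standard deviation sqrt(p*q)
# exceeds the variance p*q for every p strictly between 0 and 1.
for p in (0.1, 0.25, 0.5, 0.9):
    q = 1.0 - p
    var = p * q
    sd = math.sqrt(var)
    assert sd > var  # holds whenever 0 < var < 1
    print(f"p={p:.2f}  var={var:.4f}  sd={sd:.4f}")
```

Note that this has nothing to do with Bernoulli specifically: sd > var for any distribution whose variance is strictly between 0 and 1.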
 
I don't see why it should make any difference. Could you elaborate on what's your concern?
 
I don't see why it would. You don't need Bernoulli for this property (sigma > sigma^2); there is no rule that says the variance of, say, a normal distribution has to be > 1, so a normal can have it too.
 
Does this fact make standard deviations and theorems based on standard deviation (like Chebyshev's inequality) unusable in this case?
Read the theorem -- it tells you exactly when you can use it.
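To illustrate the point of this reply: Chebyshev's inequality, P(|X - mu| >= k*sigma) <= 1/k^2, holds for any distribution with finite variance, with no requirement that sigma exceed sigma^2 or vice versa. For a Bernoulli variable the inequality can be verified exactly, since X takes only the values 0 and 1. A quick sketch (my own, not from the thread):

```python
import math

# Chebyshev: P(|X - mu| >= k*sigma) <= 1/k**2 for any k > 0.
# For Bernoulli(p) the left-hand side can be computed exactly:
# sum the probabilities of the outcomes 0 and 1 that lie at
# least k*sigma away from the mean mu = p.
def chebyshev_holds(p: float, k: float) -> bool:
    q = 1.0 - p
    mu, sigma = p, math.sqrt(p * q)
    tail = sum(prob for x, prob in ((0, q), (1, p))
               if abs(x - mu) >= k * sigma)
    return tail <= 1.0 / k**2

# The bound holds across a range of p and k, as the theorem guarantees.
assert all(chebyshev_holds(p, k)
           for p in (0.1, 0.3, 0.5, 0.8)
           for k in (0.5, 1.0, 2.0, 3.0))
```

For k <= 1 the bound 1/k^2 is at least 1 and therefore trivially true; the interesting cases are k > 1, and the inequality holds there as well.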
 
