
#1
Jul 22, 2004, 12:49 AM

P: 190

For a normalized distribution function f(p) over a finite interval I, i.e. f(p) >= 0 and Int_I f(p) dp = 1, the standard deviation is the square root of

<p^2> - <p>^2 = Int_I p^2 f(p) dp - ( Int_I p f(p) dp )^2.

How do I find the least upper bound (if it exists) of the standard deviation? Variational calculus?
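For what it's worth, here is a quick numerical sketch (my own code, not an answer to the bound question): it computes sqrt(<p^2> - <p>^2) for a density on [a, b] with a plain Riemann sum, and compares a uniform density on [0, 1] against one that piles half the mass near each endpoint. The function name `std_of_density` and the endpoint-spike density are just illustrative choices.

```python
import numpy as np

def std_of_density(f, a, b, n=200000):
    """Standard deviation of a density f on [a, b] via a Riemann sum."""
    p = np.linspace(a, b, n)
    dp = p[1] - p[0]
    w = f(p)
    w = w / (w.sum() * dp)            # enforce Int f(p) dp = 1
    mean = (p * w).sum() * dp         # <p>
    second = (p**2 * w).sum() * dp    # <p^2>
    return np.sqrt(second - mean**2)

# Uniform density on [0, 1]: std = 1/sqrt(12) ~ 0.2887
print(std_of_density(lambda p: np.ones_like(p), 0.0, 1.0))

# Half the mass squeezed into [0, eps], half into [1 - eps, 1]:
# as eps -> 0 the std climbs toward (b - a)/2 = 0.5
eps = 1e-3
edge = lambda p: ((p < eps) | (p > 1.0 - eps)).astype(float)
print(std_of_density(edge, 0.0, 1.0))
```

Pushing the mass to the two endpoints seems to drive the standard deviation toward (b - a)/2, which suggests (if I'm not mistaken) that (b - a)/2 is the supremum, approached but not attained by any actual density.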

