Agent Smith
TL;DR: What are the uses of variance and standard deviation, and how do they differ?
 
Going through my notes ... and I see the following:   
1. Variance = ##\displaystyle \text{Var}(X) = \sigma^2 = \frac{1}{n - 1} \sum_{i = 1}^{n} \left(x_i - \overline{x}\right)^2##
2. Standard Deviation = ##\sigma = \sqrt{\text{Var}(X)} = \sqrt{\sigma^2}##
Both variance and standard deviation are measures of dispersion (colloquially, the spread in the data). The higher their values, the more spread out the data is.
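To make the two formulas concrete, here is a small Python sketch (the sample values are made up purely for illustration):

```python
import math

# Made-up sample, purely for illustration.
x = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

n = len(x)
x_bar = sum(x) / n

# Formula 1: sample variance with the n - 1 denominator.
var = sum((xi - x_bar) ** 2 for xi in x) / (n - 1)

# Formula 2: standard deviation as the square root of the variance.
sd = math.sqrt(var)

print(var, sd)  # 4.571... and 2.138...
```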
Statement B: The square root function is not linear and so standard deviation is biased when compared to variance.
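To see what Statement B might be getting at, here is a rough Monte Carlo sketch (the normal population, σ = 1, the sample size 5, and the trial count are all my own arbitrary choices): averaged over many samples, the sample variance lands on σ², but the sample standard deviation lands below σ.

```python
import math
import random
import statistics

random.seed(0)
sigma = 1.0        # known population sd (assumed, for the experiment)
n = 5              # small sample size, chosen arbitrarily
trials = 100_000

sum_var = 0.0
sum_sd = 0.0
for _ in range(trials):
    sample = [random.gauss(0.0, sigma) for _ in range(n)]
    v = statistics.variance(sample)  # uses the n - 1 denominator
    sum_var += v
    sum_sd += math.sqrt(v)

print(sum_var / trials)  # ~ 1.00: sample variance averages out to sigma^2
print(sum_sd / trials)   # ~ 0.94: sample sd comes out below sigma on average
```

If I am reading it right, the point seems to be that taking the square root after averaging is not the same as averaging after taking the square root.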
Questions:
1. Do high variance and standard deviation mean greater variability in the data?
2. What does statement B mean?