Dispersion or standard deviation?

In summary, the thread discusses the terms "dispersion" and "standard deviation" as measures of the width of a statistical distribution. The participants prefer "standard deviation" over "dispersion of coordinate" and suggest the latter may be an artifact of translation from another language. They also note that many sources describe the standard deviation as one specific measure of the more general notion of dispersion. The thread ends with a question about the name and mathematical meaning of ##\Delta \hat{x}##, and a reply confirming that it is the standard deviation.
  • #1
LagrangeEuler
In some textbooks ##\Delta \hat{x}## is called the dispersion of the coordinate, ##\Delta \hat{x}=(\langle \hat{x}^2 \rangle-\langle \hat{x} \rangle^2)^{1/2}##. To me that is just the standard deviation. What do you think?
 
  • #2
I have never seen 'dispersion of coordinate' used. It strikes me as something translated from English to another language, and then translated back.
 
  • #3
"Standard deviation" is the term that I've always used; or for its square, the "variance".

A Google search for "dispersion standard deviation" turns up many pages with statements like "the standard deviation is a measure of dispersion". So "dispersion" is the general idea of "width" of a statistical distribution, and the "standard deviation" is a specific way of describing or measuring it mathematically.
 
  • #4
What name do you use for ##\Delta \hat{x}##? And what is it for you, mathematically?
 
  • #5
Terminology can vary across fields, but in statistics and probability "standard deviation" is the usual name for this measure of how far data spread from their mean. The formula you quote for the "dispersion of coordinate" ##\Delta \hat{x}## is exactly the formula for the standard deviation of the position observable, so yes: calling ##\Delta \hat{x}## the standard deviation is correct in this context.
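To make the equivalence explicit: expanding the square and using linearity of the expectation value gives ##\langle (\hat{x}-\langle \hat{x} \rangle)^2 \rangle = \langle \hat{x}^2 \rangle - 2\langle \hat{x} \rangle\langle \hat{x} \rangle + \langle \hat{x} \rangle^2 = \langle \hat{x}^2 \rangle - \langle \hat{x} \rangle^2##, so ##\Delta \hat{x} = (\langle \hat{x}^2 \rangle - \langle \hat{x} \rangle^2)^{1/2} = \langle (\hat{x}-\langle \hat{x} \rangle)^2 \rangle^{1/2}##. That is the root-mean-square deviation from the mean, which is exactly the definition of a standard deviation.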
 

What is dispersion or standard deviation?

Dispersion is the general notion of the variability or spread of a set of data, and the standard deviation is the most common way of measuring it. Both terms refer to how far the data points deviate from the mean, or average, of the data set.

How is standard deviation calculated?

The standard deviation is calculated by taking the square root of the average of the squared deviations of the data points from the mean: ##\sigma = \left(\frac{1}{N}\sum_{i=1}^{N}(x_i-\mu)^2\right)^{1/2}##, where ##\mu## is the mean and ##N## the number of data points. (For a sample rather than a full population, the sum is often divided by ##N-1## instead of ##N##.)
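As a concrete illustration of that recipe, here is a minimal Python sketch; the data values are invented for the example:

```python
import math

# Illustrative data set (hypothetical values chosen for round numbers)
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

# Mean of the data
mean = sum(data) / len(data)

# Population variance: the average of the squared deviations from the mean
variance = sum((x - mean) ** 2 for x in data) / len(data)

# Standard deviation: the square root of the variance
std_dev = math.sqrt(variance)

print(mean)      # 5.0
print(variance)  # 4.0
print(std_dev)   # 2.0
```

numpy.var and numpy.std give the same results; they default to the divide-by-##N## (population) convention, and passing ddof=1 switches to the divide-by-##N-1## sample convention.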

What is the relationship between standard deviation and variance?

Variance is the average of the squared deviations from the mean, and the standard deviation is its square root. The two differ in their units: variance carries the squared units of the data, while the standard deviation has the same units as the data itself, which is why the standard deviation is usually the easier of the two to interpret and report.

What does a high or low standard deviation indicate?

A high standard deviation indicates that the data points are spread out over a larger range, while a low standard deviation indicates that the data points are close to the mean. In other words, a higher standard deviation suggests that the data set has more variability.
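For example, the data sets {4, 5, 6} and {0, 5, 10} both have mean 5, but their (population) standard deviations are about 0.82 and 4.08 respectively: the second set is far more spread out around the same mean.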

Why is standard deviation important in statistical analysis?

Standard deviation is important in statistical analysis as it helps us understand the distribution of the data and how representative the mean is. It also helps in comparing different data sets and identifying outliers. It is often used to make decisions or draw conclusions about a population based on a sample of data.
