Understanding Mean Squared Deviation and Its Usage in Statistical Analysis

  • Context: Undergrad
  • Thread starter: zeeshahmad
  • Tags: deviation, mean

Discussion Overview

The discussion centers around the concept of "Mean Squared Deviation" and its applications in statistical analysis. Participants explore its definition, the mathematical formulation, and the interpretation of notation used in statistics.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Mathematical reasoning

Main Points Raised

  • One participant asks for clarification on the meaning of "Mean Squared Deviation" and the angle-bracket notation <x1+x2+...+xn>.
  • Another participant explains that "mean squared deviation" can refer to two different concepts: the sample variance (the sum of squared deviations divided by the degrees of freedom) and the plain average of the squared deviations (divided by n).
  • A later reply discusses the interpretation of the brackets <...> as an average with respect to the whole distribution, linking it to the concept of expected value.
  • Further elaboration is provided on how the expected value is calculated for discrete and continuous variables, concluding that <x1+x2+...+xn> represents the expected value of the sum.

Areas of Agreement / Disagreement

Participants note that "mean squared deviation" admits two interpretations depending on convention (dividing by n − 1 or by n), so which one applies depends on context. Once the bracket notation is identified as the expected value, the original poster's question is resolved.

Contextual Notes

There are ambiguities in the definitions provided, particularly regarding the interpretation of "mean squared deviation" and the notation <..>, which depend on context and specific statistical frameworks.

zeeshahmad
Could someone explain the meaning of "Mean Squared Deviation"?

Also, in <x1 + x2 + ... + xn>,
what is the meaning of the pointy brackets <...> ?
 
Welcome to PF, zeeshahmad!


Hmm, didn't I see you somewhere else? :wink:

The term "mean squared deviation" is a bit ambiguous and can mean two things.
I'll try to explain.


Suppose you have a sample of n measurements x1, x2, ..., xn.

Then the sum of squared deviations, often abbreviated as SS, is:
$$SS = \sum (x_i - \bar x)^2$$
where ##\bar x## is the mean.

This set of measurements comes with a number of "degrees of freedom", abbreviated DF.
For a "normal" repeated measurement, we have:
$$DF = n - 1$$

In statistics, when the term "mean squared deviation" is used, it usually means:
$$MS = {SS \over DF}$$
This is exactly the variance (or squared standard deviation) of the sample.


However, taken literally, "mean squared deviation" means just the average of the squared deviations, which is:
$$SS \over n$$
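As a quick numerical illustration, both readings of "mean squared deviation" can be computed and compared against Python's standard library (a minimal sketch; the sample values are made up):

```python
import statistics

# Hypothetical sample of n measurements (values chosen for illustration)
x = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(x)
xbar = sum(x) / n  # the mean, x-bar

# Sum of squared deviations (SS)
ss = sum((xi - xbar) ** 2 for xi in x)

# "Mean squared deviation" as the sample variance: MS = SS / DF, DF = n - 1
ms = ss / (n - 1)

# "Mean squared deviation" taken literally: SS / n
msd = ss / n

print(ms)   # agrees with statistics.variance(x)
print(msd)  # agrees with statistics.pvariance(x)
```

The two results differ by the factor (n − 1)/n, which is why the distinction matters most for small samples.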



I can't tell you what <x1+x2+..xn> means.
Do you have a context for that?
 
Nice posting to you again :approve:
Actually I have got the lecture notes, in which it tells the meaning, but I don't understand it:

"Consider a distribution with average value μ and standard deviation σ from which a sample of measurements is taken, i.e.

$$\mu = \left\langle x \right\rangle$$
$$\sigma^2 = \left\langle x^2 \right\rangle - {\left\langle x \right\rangle}^2$$

"where the brackets <..> mean an average with respect to the whole distribution."
 
zeeshahmad said:
Nice posting to you again :approve:
Actually I have got the lecture notes, in which it tells the meaning, but I don't understand it:

"Consider a distribution with average value μ and standard deviation σ from which a sample of measurements is taken, i.e.

$$\mu = \left\langle x \right\rangle$$
$$\sigma^2 = \left\langle x^2 \right\rangle - {\left\langle x \right\rangle}^2$$

"where the brackets <..> mean an average with respect to the whole distribution."

Ah, I see what you mean.
<...>, as you show it, is also called the "expected value".
The expected value of a variable X is also written as EX or E(X).

If the variable x can take only specific values ##x_i##, each with an associated probability ##p_i##, then in general the expectation of a function f(x) is:
$$\langle f(x) \rangle = \sum f(x_i)p_i$$
Or if x is a continuous variable, it is:
$$\langle f(x) \rangle = \int f(x)p(x)dx$$
where p(x) is the so-called probability density function.

So <x1+x2+...+xn> would be the expected value of the sum.
This is equal to <x1>+<x2>+...+<xn>.
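To make the bracket notation concrete, here is a minimal Python sketch (the fair-die distribution is my own example, not from the thread). It computes a discrete expectation as the sum of f(x_i)·p_i, checks the variance identity σ² = <x²> − <x>² from the lecture notes, and illustrates the linearity of the sum:

```python
# Hypothetical example: a fair six-sided die
outcomes = [1, 2, 3, 4, 5, 6]
p = [1 / 6] * 6  # probability p_i of each value x_i

def expect(f):
    """Discrete expected value: <f(x)> = sum of f(x_i) * p_i."""
    return sum(f(xi) * pi for xi, pi in zip(outcomes, p))

mu = expect(lambda x: x)                  # <x>, mathematically 7/2
var = expect(lambda x: x ** 2) - mu ** 2  # <x^2> - <x>^2, mathematically 35/12

# Linearity: <x1 + x2 + ... + xn> = <x1> + <x2> + ... + <xn>,
# so for n dice the expected sum is simply n * <x>.
n = 3
expected_sum = n * mu

print(mu, var, expected_sum)
```

The same `expect` pattern carries over to the continuous case, with the sum replaced by an integral against the density p(x).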
 
Thank you for the detailed explanation
:smile:
 
