Parametric function or statistic?

Discussion Overview

The discussion revolves around the classification of certain mathematical expressions, specifically whether they represent a statistic or a parametric function. Participants explore concepts related to expected values, samples, and probability distributions, engaging with theoretical implications and definitions.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant suggests that E(|x1-x2|) might be a statistic, raising a question about its classification.
  • Another participant argues that "E" represents a probability-weighted average, indicating that it involves the entire set of outcomes rather than just a sample, thus suggesting it is not a statistic.
  • A subsequent post questions whether |x1-x2| itself can be classified as a statistic.
  • Some participants propose that the expected value E(u) is not applicable to specific sample values, as it pertains to random variables or functions of random variables.
  • Clarifications are made regarding the expectation of a constant and the nature of samples versus random variables, emphasizing that once a sample is selected, it is no longer probabilistic.
  • Detailed mathematical examples are provided to illustrate the computation of expected values for both continuous and discrete probability distributions.

Areas of Agreement / Disagreement

Participants express differing views on the classification of E(|x1-x2|) and the application of expected values to samples. The discussion remains unresolved, with multiple competing interpretations of the concepts presented.

Contextual Notes

Participants note the importance of distinguishing between random variables and specific sample values, as well as the implications of using expected values in different contexts. There are unresolved assumptions regarding the definitions of statistics and parametric functions.

EvLer
Which one would you say this is:

E(|x1 - x2|)
where x1, x2, ..., xn is a sample of n values on the underlying random variable.
I was thinking this is a statistic.
 
"E" of something is a probability-weighted average outcome, so it involves the entire set of outcomes rather than just a sample. A statistic involves only a sample. The x's in E[f(X1,x2)] are not sample values, they are shorthands for two random variables or distributions (e.g., "Normal distribution 1" and "Normal distribution 2").
 
so... parametric function?
 
"Probabilistic representation of the set of outcomes." I guess para. func. is acceptable for cases where the prob. rep. reduces to a parametric function.
 
so, then what about |x1-x2|? would you say it is a statistic?
 
Yes, that's a statistic. The initial confusing factor was your use of "E". Did you intend that to be the expectation? If so, E(u) is the expected value of a random variable or function of a random variable. It doesn't make sense to talk about the "expected" value of specific values from a sample.
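To make the distinction concrete, here is a small sketch (assuming, purely for illustration, that the underlying variable is Uniform(0,1)): |x1 - x2| computed from a drawn sample is a statistic and varies from draw to draw, while E|X1 - X2| is a single number fixed by the distribution; for two independent Uniform(0,1) variables it equals 1/3, which a Monte Carlo average approximates.

```python
import random

def abs_diff_statistic():
    # |x1 - x2| from one drawn sample of two values: a statistic,
    # so it comes out different on every draw.
    x1, x2 = random.random(), random.random()
    return abs(x1 - x2)

# E|X1 - X2| for the distribution itself: a fixed number (1/3 for two
# independent Uniform(0,1) variables), approximated here by averaging
# many independent realizations of the statistic.
estimate = sum(abs_diff_statistic() for _ in range(200_000)) / 200_000
print(abs_diff_statistic())  # varies from run to run
print(estimate)              # close to 1/3
```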
 
thanks a lot for explaining.
so to make sure I understand, it would be wrong to "do" expected value on a sample, since E's domain is data from the entire distribution?
edit: would you call E() a function? If not, what is it formally? Sorry if I don't get this right away...
Thank you again.
 
Let c be a (deterministic) constant. Then the expected value of c is always c.

You can take the expectation of a sample, but the result will be the sample itself. That's because a sample {x1, ..., xn} (unlike the random variables {X1, ..., Xn}) is a set of values that have already been determined (by the very act of selecting them from the set of possible outcomes). There is nothing inherently probabilistic or random about a sample once you have it.
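A minimal sketch of that distinction, assuming a Uniform(0,1) variable for illustration: averaging many fresh draws approximates E[X], while a sample already in hand is just fixed numbers, and the "expectation" of each fixed value (a constant) is the value itself.

```python
import random

# E[X] for X ~ Uniform(0,1) is 1/2: approximate it by averaging many
# fresh draws from the distribution -- the randomness is still "live".
draws = [random.random() for _ in range(100_000)]
print(sum(draws) / len(draws))  # close to 0.5

# A sample already selected is deterministic: the expectation of a
# constant c (all probability mass sits on c) is just 1 * c = c,
# so "E of the sample" returns the sample unchanged.
sample = [0.2, 0.7, 0.9]
print([1.0 * c for c in sample])
```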

And yes, in "standard" theory and applications, E[f(O)] (where "E" is a shorthand for "EP") is a function given by the integral of f(O)dP(O) where f is any (continuous, to be safe) function defined over the set of outcomes O, and P is the cumulative probability distribution defined over O. For discrete cases, it is the inner product f(O).p(O) where p is the density defined over O.

E[f(O)] is the P-measure of the set f(O), and hence the technically correct notation "EP."
 
Examples:
Continuous P, specific parameters: Let U_[0,1] be the uniform probability distribution over the unit interval I = [0,1]: U_[0,1](v) = v for v in I. Then [itex]E[U_{[0,1]}] = E_{U_{[0,1]}}[I] = \int_0^1 v \, dU_{[0,1]}(v) = \int_0^1 v \, dv = \left.\frac{v^2}{2}\right|_0^1 = 1/2[/itex].
Continuous P, general parameters: Let U_[a,b] be the uniform probability distribution over any interval J = [a,b]: U_[a,b](v) = (v-a)/(b-a) for v in J. Then
[tex] E[U_{[a,b]}] = E_{U_{[a,b]}}[J] = \int_a^b v \, dU_{[a,b]}(v) = \int_a^b \frac{v}{b-a} \, dv = \left.\frac{v^2}{2(b-a)}\right|_a^b = \frac{b^2-a^2}{2(b-a)} = \frac{a+b}{2}[/tex]

Discrete P, specific parameters: Let p be the uniform density function p(v) = 1/2 for all v in O = {0, 1}. Then E_P[{0,1}] = (1/2, 1/2)·(0, 1) = (1/2)(1) + (1/2)(0) = 1/2.
Discrete P, general parameters: Let p be the uniform density function p(v) = 1/n for all v in W = {1, ..., n}. Then E_P[W] = (1/n, ..., 1/n)·(1, ..., n) = (1 + ... + n)/n = n(n+1)/(2n) = (n+1)/2.
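These worked answers can be checked numerically. A sketch (the helper names are mine, and the midpoint Riemann sum is just a crude stand-in for the integrals above):

```python
# Continuous uniform on [a, b]: E = (a + b)/2, via a midpoint Riemann
# sum of v * (1/(b-a)) dv over [a, b].
def uniform_mean(a, b, steps=100_000):
    dv = (b - a) / steps
    return sum((a + (i + 0.5) * dv) / (b - a) * dv for i in range(steps))

# Discrete uniform on {1, ..., n}: E = (n + 1)/2, via the inner product
# of the density (1/n, ..., 1/n) with the outcomes (1, ..., n).
def discrete_uniform_mean(n):
    return sum((1 / n) * v for v in range(1, n + 1))

print(uniform_mean(0, 1))        # close to 0.5
print(uniform_mean(2, 10))       # close to 6.0 = (2 + 10)/2
print(discrete_uniform_mean(5))  # 3.0 = (5 + 1)/2
```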
 
