Correlation coefficient between continuous functions


Discussion Overview

The discussion centers around the correlation coefficient between continuous functions and the potential for a formalism analogous to that of discrete variables. Participants explore the relationship between continuous probability distributions and correlation, particularly in the context of transformations involving trigonometric functions.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant questions whether a correlation coefficient can be defined for continuous probability distributions, similar to how mean and standard deviation are expressed.
  • Another participant references a solution from Cuadras 2002 and Hoeffding's lemma, seeking clarification on transitioning from independent probability density functions to a multivariate cumulative distribution function.
  • A participant corrects their earlier statement about independence, specifying that the uniform distribution P(x) is defined on the interval [0, Pi] and relates to the functions u = Sin[x] and v = Cos[x].
  • There is a discussion about the challenges of deriving the joint probability distribution G(x,y) and the use of Jacobians in transforming from univariate to bivariate distributions.
  • One participant suggests that the joint distribution may not be necessary since U and V are already defined as functions of X, implying that expected values can be computed as one-dimensional integrals.

Areas of Agreement / Disagreement

Participants express differing views on the necessity and methodology for deriving joint distributions and correlation coefficients in the context of continuous functions. The discussion remains unresolved with multiple competing perspectives.

Contextual Notes

Participants note limitations in their understanding of the transformation process from univariate to bivariate distributions, particularly regarding the computation of Jacobians and determinants.

natski
Hi all,

The correlation coefficient (Pearson's) is usually defined in terms of discrete samples of a variable. However, the mean and standard deviation, for example, are also typically written in terms of discrete variables but may equally be expressed in terms of a continuous probability distribution; e.g. the mean may be written as \mu_x = \int x \, p(x) \, dx.

So my question is: does there exist a similar formalism for the correlation coefficient between two continuous probability distributions? Any help would be greatly appreciated on this issue, for which many Google searches came up empty-handed. :-)

Natski
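For what it's worth, the continuous analogue simply replaces the sums in Pearson's formula with integrals against a joint density: \rho = (E[XY] - E[X]E[Y]) / (\sigma_X \sigma_Y), with E[XY] = \int\int xy \, f(x,y) \, dx \, dy. A minimal numerical sketch, using an arbitrary illustrative joint density f(x,y) = x + y on the unit square (not a density from this thread):

```python
import numpy as np
from scipy.integrate import dblquad

# Illustrative joint density on the unit square (integrates to 1):
# f(x, y) = x + y for 0 <= x, y <= 1
f = lambda x, y: x + y

def E(g):
    """Expectation of g(X, Y) under the joint density f.

    dblquad integrates func(y, x) over x in [0, 1], y in [0, 1].
    """
    val, _ = dblquad(lambda y, x: g(x, y) * f(x, y),
                     0.0, 1.0, lambda x: 0.0, lambda x: 1.0)
    return val

Ex, Ey = E(lambda x, y: x), E(lambda x, y: y)   # each 7/12 by symmetry
Exy = E(lambda x, y: x * y)                     # 1/3
var_x = E(lambda x, y: x**2) - Ex**2
var_y = E(lambda x, y: y**2) - Ey**2

rho = (Exy - Ex * Ey) / np.sqrt(var_x * var_y)
print(rho)  # analytically -1/11 for this density
```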
 
I found a reference: Cuadras 2002, On the Covariance between Functions, which cites a solution from 1940 known as Hoeffding's lemma. The lemma is based on cumulative distribution functions.

Does anyone know how to go from two independent probability density functions P(x) and P(y) to the multivariate cumulative distribution function C(x,y)?

Natski
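For reference, Hoeffding's lemma expresses the covariance directly through the joint and marginal CDFs (stated here from memory; it holds whenever the covariance exists):

```latex
\mathrm{Cov}(X, Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty}
  \left[ H(x, y) - F(x)\, G(y) \right] \, dx \, dy
```

where H is the joint CDF of (X, Y) and F, G are the marginal CDFs.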
 
Actually, a correction to my last post: the two variables are not independent. Here is the problem restated for my exact case.

P(x) is a uniform distribution: P(x) = 1/\pi for all x between 0 and \pi.

Now let u = Sin[x] and v = Cos[x].

What is the joint probability distribution G(x,y)? I know how to do univariate and bivariate transformations using the Jacobian, but going from a univariate to a bivariate distribution seems to require a 2 x 1 Jacobian, for which no determinant can be computed. This is where I get stuck.
 
natski said:
What is the joint probability distribution G(x,y)? [...] This is where I get stuck.

The joint distribution isn't needed here: you already have U and V as functions of X, so expected values such as E(UV), E(U), and E(V) can be computed as one-dimensional integrals over x.
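To make that concrete, here is a quick numerical sketch (mine, not from the thread) treating everything as one-dimensional integrals against p(x) = 1/\pi on [0, \pi]:

```python
import numpy as np
from scipy.integrate import quad

# X ~ Uniform(0, pi), so p(x) = 1/pi; U = sin(X), V = cos(X).
p = 1.0 / np.pi

def E(g):
    """Expectation of g(X) as a one-dimensional integral against p(x)."""
    val, _ = quad(lambda x: g(x) * p, 0.0, np.pi)
    return val

Eu, Ev = E(np.sin), E(np.cos)              # 2/pi and 0
Euv = E(lambda x: np.sin(x) * np.cos(x))   # 0
cov = Euv - Eu * Ev
var_u = E(lambda x: np.sin(x)**2) - Eu**2  # 1/2 - 4/pi^2
var_v = E(lambda x: np.cos(x)**2) - Ev**2  # 1/2
rho = cov / np.sqrt(var_u * var_v)
print(rho)  # 0: here U and V are uncorrelated despite being dependent
```

Interestingly, the correlation comes out as exactly zero for this case, even though U and V are deterministically related through X.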
 
