Correlation coefficient between continuous functions

AI Thread Summary
The discussion revolves around the correlation coefficient between continuous functions, particularly how to express it in terms of continuous probability distributions. The user seeks clarification on transitioning from independent probability density functions to a joint cumulative distribution function. A reference to Hoeffding's lemma is mentioned as a potential solution. The user specifies a case where P(x) is a uniform distribution, and seeks the joint probability distribution for transformations involving sine and cosine functions. It is concluded that the joint distribution may not be necessary, as expected values can be computed using one-dimensional integrals.
natski
Hi all,

The correlation coefficient (Pearson's) is usually defined in terms of discrete samples of a function. However, I have seen that the mean and standard deviation, for example, are also typically written in terms of discrete variables but may equally be expressed in terms of a continuous probability distribution: e.g. the mean may be written as ##\mu_x = \int x\, p(x)\, dx##, and likewise the variance as ##\sigma_x^2 = \int (x - \mu_x)^2\, p(x)\, dx##.

So my question is: does a similar formalism exist for the correlation coefficient between two continuous probability distributions? Any help would be greatly appreciated on this issue, for which many Google searches came up empty-handed. :-)

Natski
 
I found a reference, Cuadras (2002), "On the Covariance between Functions", which cites a solution from 1940 known as Hoeffding's lemma. The lemma is expressed in terms of cumulative distribution functions.
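For reference, the identity is usually stated as follows (a standard form, with ##H## the joint CDF of ##(X,Y)## and ##F##, ##G## the marginal CDFs):

$$\mathrm{Cov}(X,Y) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \left[ H(x,y) - F(x)\,G(y) \right] dx\, dy$$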

Does anyone know how to go from two independent P(x) and P(y) to C(x,y), where the P's are the probability density functions and C is the joint (multivariate) cumulative distribution function?
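(For independent variables the joint CDF simply factorizes, ##C(x,y) = F_X(x)\,F_Y(y)##, where each marginal CDF is the integral of its density; plugging this into Hoeffding's identity above gives zero covariance, as expected. The interesting case is dependence, as it turns out below.)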

Natski
 
Actually, a correction to my last post: the two variables are not independent. Here is the problem restated for my exact case.

##p(x)## is a uniform distribution: ##p(x) = 1/\pi## for all ##x## between ##0## and ##\pi##.

Now let ##u = \sin x## and ##v = \cos x##.

What is the joint probability distribution ##G(u,v)##? I know how to do univariate and bivariate transformations using the Jacobian, but going from a univariate to a bivariate distribution seems to require a 2 × 1 Jacobian, for which no determinant can be computed. This is where I get stuck.
 
natski said:
Now let ##u = \sin x## and ##v = \cos x##. What is the joint probability distribution ##G(u,v)##? [...] This is where I get stuck.

The joint distribution isn't needed: you already have ##U## and ##V## as functions of ##X##, so expectations such as ##E(UV)## reduce to one-dimensional integrals over ##x##. (Indeed, since ##u^2 + v^2 = 1##, the pair ##(U,V)## is concentrated on the upper half of the unit circle and has no two-dimensional density, which is why the Jacobian method breaks down.)
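To make this concrete, here is a worked sketch (the numbers below follow from the uniform density, not from anything stated in the thread). With ##p(x) = 1/\pi## on ##[0, \pi]##,

$$E[UV] = \frac{1}{\pi}\int_0^\pi \sin x \cos x \, dx = 0, \qquad E[U] = \frac{1}{\pi}\int_0^\pi \sin x \, dx = \frac{2}{\pi}, \qquad E[V] = \frac{1}{\pi}\int_0^\pi \cos x \, dx = 0,$$

so ##\mathrm{Cov}(U,V) = E[UV] - E[U]E[V] = 0## and hence the Pearson correlation ##\rho = \mathrm{Cov}(U,V)/(\sigma_U \sigma_V) = 0##: the variables are uncorrelated even though they are perfectly dependent. A minimal numerical check in Python (the helper name expect is illustrative, not from the thread):

Code:
# Sketch: Pearson correlation of U = sin(X), V = cos(X) for X ~ Uniform(0, pi),
# computed entirely from one-dimensional integrals -- no joint density needed.
import numpy as np
from scipy.integrate import quad

def expect(f):
    """E[f(X)] for X ~ Uniform(0, pi): (1/pi) times the integral of f over [0, pi]."""
    val, _ = quad(lambda x: f(x) / np.pi, 0.0, np.pi)
    return val

Eu = expect(np.sin)                            # E[U]  = 2/pi
Ev = expect(np.cos)                            # E[V]  = 0
Euv = expect(lambda x: np.sin(x) * np.cos(x))  # E[UV] = 0
var_u = expect(lambda x: np.sin(x) ** 2) - Eu**2   # Var(U) = 1/2 - 4/pi^2
var_v = expect(lambda x: np.cos(x) ** 2) - Ev**2   # Var(V) = 1/2

rho = (Euv - Eu * Ev) / np.sqrt(var_u * var_v)
print(rho)  # ~0: uncorrelated, despite U and V being functionally dependent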
 