Correlation coefficient between continuous functions

natski
Hi all,

The correlation coefficient (Pearson's) is usually defined in terms of discrete samples of a function. However, I have seen that the mean and standard deviation, for example, are also typically written in terms of discrete variables BUT may equally be expressed in terms of a continuous probability distribution, e.g. the mean may be written as \mu_x = \int x p(x) dx.

So my question is: does a similar formalism exist for the correlation coefficient between two continuous random variables? Any help would be greatly appreciated on this issue, for which many Google searches came up empty-handed. :-)

Natski
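For reference, the continuous analogue follows the same pattern as the mean written above: a sketch of the standard definitions, assuming a joint density p(x, y) exists (symbols are the usual ones, not taken from this thread):

\mu_X = \int x \, p(x) \, dx, \qquad \sigma_X^2 = \int (x - \mu_X)^2 \, p(x) \, dx

\rho_{XY} = \frac{\mathrm{Cov}(X, Y)}{\sigma_X \sigma_Y} = \frac{\int\!\int (x - \mu_X)(y - \mu_Y) \, p(x, y) \, dx \, dy}{\sigma_X \sigma_Y}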
 
I found a reference: Cuadras 2002, On the Covariance between Functions, which cites a solution from 1940 known as Hoeffding's lemma. This lemma expresses the covariance in terms of the cumulative distribution functions.
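Hoeffding's identity, Cov(X, Y) = \int\!\int [H(x,y) - F(x)G(y)] \, dx \, dy with H the joint CDF and F, G the marginals, can be sketched numerically. A minimal check (my own illustration; the test case X ~ Uniform(0, 1) with Y = X is an assumption, chosen because the exact answer is Var(X) = 1/12):

```python
# Numerical check of Hoeffding's covariance identity:
#   Cov(X, Y) = integral of [H(x, y) - F(x) G(y)] dx dy,
# where H is the joint CDF and F, G are the marginal CDFs.
# Assumed test case: X ~ Uniform(0, 1) and Y = X, so on [0, 1]^2
#   H(x, y) = min(x, y), F(x) = x, G(y) = y,
# and the exact covariance is Var(X) = 1/12.
from scipy.integrate import dblquad

cov, err = dblquad(lambda y, x: min(x, y) - x * y,
                   0, 1,                      # x limits
                   lambda x: 0, lambda x: 1)  # y limits
print(cov)  # ~ 0.0833 = 1/12
```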

Does anyone know how to go from two independent densities P(x) and P(y) to C(x,y), where the P's are the probability density functions and C is the joint (multivariate) cumulative distribution function?

Natski
 
Actually, a correction to my last post: the two variables are not independent. Here is the problem restated for my exact case.

P(x) is a uniform distribution: P(x) = 1/Pi for all x between 0 and Pi.

Now let u = Sin[x] and v = Cos[x]

What is the joint probability distribution G(u,v)? I know how to do univariate and bivariate transformations using the Jacobian, but going from univariate to bivariate seems to require a 2 x 1 Jacobian, for which no determinant can be computed. This is where I get stuck.
 
natski said:
Actually, a correction to my last post: the two variables are not independent. Here is the problem restated for my exact case.

P(x) is a uniform distribution: P(x) = 1/Pi for all x between 0 and Pi.

Now let u = Sin[x] and v = Cos[x]

What is the joint probability distribution G(u,v)? I know how to do univariate and bivariate transformations using the Jacobian, but going from univariate to bivariate seems to require a 2 x 1 Jacobian, for which no determinant can be computed. This is where I get stuck.

The joint distribution isn't needed: you already have U and V as functions of X, so E(U), E(V), E(UV), etc. are one-dimensional integrals over x.
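A quick sketch of that one-dimensional approach (my own illustration, using scipy.integrate.quad; the helper name E is an assumption): with X ~ Uniform(0, Pi), U = Sin[x], V = Cos[x], every moment of (U, V) reduces to a single integral against the density p(x) = 1/Pi.

```python
import numpy as np
from scipy.integrate import quad

p = 1 / np.pi  # density of X ~ Uniform(0, pi)

def E(f):
    """E[f(X)] as a one-dimensional integral over (0, pi)."""
    val, _ = quad(lambda x: f(x) * p, 0, np.pi)
    return val

EU = E(np.sin)                            # = 2/pi
EV = E(np.cos)                            # = 0
EUV = E(lambda x: np.sin(x) * np.cos(x))  # = 0
cov = EUV - EU * EV
rho = cov / np.sqrt((E(lambda x: np.sin(x)**2) - EU**2) *
                    (E(lambda x: np.cos(x)**2) - EV**2))
print(rho)  # ~ 0: U and V are uncorrelated here, though clearly dependent
```

Note the outcome: the correlation coefficient vanishes even though U and V are deterministically related, a standard reminder that zero correlation does not imply independence.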
 