neznam said:
Θ1 = log[p1/(1-p1)] where p1 is binomial
Θ2 = log[p2/(1-p2)] where p2 is binomial
How are those formulas going to make sense if the p's are binomial random variables? They will take values from the set {0, 1, 2, ..., N}. Suppose p1 = 3. Then p1/(1-p1) = 3/(-2) is negative. Won't you be trying to take the log of a negative number?
What I am really looking for is an expression for the covariance when I have a function of a random variable instead of only the random variable itself.
My guess at what you're trying to say is this:
Let X and Y be random variables. Let f and g be functions of a single variable and let the random variables F and G be defined as F= f(X) and G = g(Y). What is a method for expressing COV(F,G) in terms of COV(X,Y) ?
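I don't know whether this is what the original poster has in mind, but one standard first-order answer is the "delta method": linearize f and g around the means, which gives Cov(f(X), g(Y)) ≈ f'(EX) g'(EY) Cov(X, Y). Here is a minimal Monte Carlo sketch checking that approximation; the bivariate normal distribution and the choices f(x) = e^x, g(y) = y^2 are just hypothetical test cases:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test case: (X, Y) bivariate normal with small variances,
# so the first-order approximation should be accurate.
mean = np.array([2.0, 3.0])
cov_xy = np.array([[0.01, 0.01],
                   [0.01, 0.04]])
X, Y = rng.multivariate_normal(mean, cov_xy, size=1_000_000).T

# F = f(X) = e^X,  G = g(Y) = Y^2
F = np.exp(X)
G = Y ** 2

# Delta-method approximation: Cov(F, G) ≈ f'(E X) * g'(E Y) * Cov(X, Y)
# with f'(x) = e^x and g'(y) = 2y.
approx = np.exp(mean[0]) * (2 * mean[1]) * cov_xy[0, 1]

# Monte Carlo estimate of the exact Cov(F, G).
mc = np.cov(F, G)[0, 1]

print(f"approx = {approx:.4f}, monte carlo = {mc:.4f}")
```

The approximation ignores second- and higher-order terms, so it is only reliable when X and Y are concentrated near their means (small variances, as above).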
An even more general question is:
Let X and Y be random variables. Let f and g be functions of two variables and let the random variables F and G be defined as F= f(X,Y) and G = g(X,Y). What is a method for expressing COV(F,G) in terms of COV(X,Y) ?
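For the two-variable case, the same first-order linearization gives a standard approximation (with all partial derivatives evaluated at the point (EX, EY)); I offer it only as a sketch, not as a claim about what was originally intended:

```latex
\operatorname{Cov}(F,G) \approx
  f_x g_x \operatorname{Var}(X)
  + \left( f_x g_y + f_y g_x \right) \operatorname{Cov}(X,Y)
  + f_y g_y \operatorname{Var}(Y),
\qquad
f_x = \left.\frac{\partial f}{\partial x}\right|_{(EX,\,EY)}, \text{ etc.}
```

Note this expresses Cov(F,G) in terms of Var(X), Var(Y), and Cov(X,Y) together, not Cov(X,Y) alone, which suggests the question as posed may not have an answer involving only Cov(X,Y).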
I don't claim to know the answer to that question. One thought is to use power series expansions.
Let M = E( F^k G^j) be a moment of the joint distribution of (F,G). Expand the function F^k G^j as a power series in X and Y. Then the expectation E( F^k G^j) becomes a sum of expectations like E( C X^r Y^s ) where C is a constant that involves evaluations of partial derivatives of f and g.
The particular case of COV(F,G) involves the particular moments E(F), E(G) and E(FG).
We can work out what the power series method says in that case. (Or if we are lucky, some other keen forum member will do it for us!)
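As a sanity check on where that expansion leads, here is a Monte Carlo sketch of the two-variable case: it compares Cov(F,G) against the first-order terms of the series (partial derivatives at the mean). The functions f(x,y) = xy, g(x,y) = x + y^2 and the distribution are hypothetical choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical (X, Y): bivariate normal with small variances.
mean = np.array([1.0, 2.0])
cov_xy = np.array([[0.010, 0.005],
                   [0.005, 0.020]])
X, Y = rng.multivariate_normal(mean, cov_xy, size=1_000_000).T

# Hypothetical functions of two variables:
# F = f(X, Y) = X * Y,  G = g(X, Y) = X + Y^2
F = X * Y
G = X + Y ** 2

# Partial derivatives evaluated at the mean (EX, EY) = (1, 2):
fx, fy = mean[1], mean[0]        # f_x = y, f_y = x
gx, gy = 1.0, 2 * mean[1]        # g_x = 1, g_y = 2y

vX, vY, cXY = cov_xy[0, 0], cov_xy[1, 1], cov_xy[0, 1]

# First-order terms of the power series expansion:
# Cov(F,G) ≈ f_x g_x Var(X) + (f_x g_y + f_y g_x) Cov(X,Y) + f_y g_y Var(Y)
approx = fx * gx * vX + (fx * gy + fy * gx) * cXY + fy * gy * vY

# Monte Carlo estimate of the exact Cov(F, G).
mc = np.cov(F, G)[0, 1]

print(f"approx = {approx:.5f}, monte carlo = {mc:.5f}")
```

With these polynomial f and g, the higher-order terms involve products of variances and are tiny, so the first-order answer is already close; with more strongly curved functions or larger variances, the second-order terms in the expansion would matter.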