I'm building a probability-related software package and need a theoretical framework for some less common issues (i.e., things you don't find in the average textbook). I'm hoping somebody can point me to the proper formulae. Essentially, I want to do arbitrary (symbolic) calculations on random variables involving both sum and product operations, while fully taking into account arbitrary correlations between the variables.

If I have the correlation coefficients of the variables, summing them is easy. Likewise, if I have the correlation coefficients of their logs, multiplying them is easy, since a product amounts to summing the logs and the log/exponent formulae for normals are well defined. And, of course, determining the resulting correlations between the sums (or products) and the original variables is straightforward.

Where things get murkier (for me, at least) is beyond these simple cases. If I want to multiply two sums, how do I handle the correlations? Say z = (a + b) * (c + d). If I know the correlation of a and b, I can easily determine the correlation of their sum with a or b; similarly for c + d. But once I multiply those two sums, how do I determine the correlation of z with the original variables a, b, c, and d?

And if all the variables originally had some non-zero correlation, how do I account for that in the product? The original summations give me the correlation coefficients with respect to the original variables, but not the correlations of the logarithms, which is what I would need for the multiplication.

Can anybody give me a clue how to start figuring this out? Thanks.
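For the z = (a + b) * (c + d) sub-question there is at least one tractable special case I've been experimenting with: if a, b, c, d are assumed jointly Gaussian, then for jointly Gaussian X, Y, Z the identity Cov(XY, Z) = E[X] Cov(Y, Z) + E[Y] Cov(X, Z) holds exactly (it follows from Isserlis' theorem), so the covariance of z with each original variable comes straight from the mean vector and covariance matrix. Here is a minimal sketch checking that against Monte Carlo; the mean vector and covariance matrix are made-up numbers for illustration:

```python
import numpy as np

# Hypothetical example: a, b, c, d jointly Gaussian with mean mu and covariance Sigma.
mu = np.array([1.0, 2.0, 0.5, -1.0])        # means of a, b, c, d (made-up)
Sigma = np.array([[1.0, 0.3, 0.2, 0.1],     # symmetric, diagonally dominant,
                  [0.3, 2.0, 0.0, 0.2],     # hence positive definite
                  [0.2, 0.0, 1.5, 0.3],
                  [0.1, 0.2, 0.3, 1.0]])

def cov_sum_product_with(mu, Sigma, k):
    """Cov((a+b)*(c+d), x_k) for jointly Gaussian x = (a, b, c, d), using
    the exact identity Cov(X*Y, Z) = E[X]*Cov(Y, Z) + E[Y]*Cov(X, Z)."""
    mu_X = mu[0] + mu[1]                    # E[a + b]
    mu_Y = mu[2] + mu[3]                    # E[c + d]
    cov_X_k = Sigma[0, k] + Sigma[1, k]     # Cov(a + b, x_k) is linear in the rows
    cov_Y_k = Sigma[2, k] + Sigma[3, k]     # Cov(c + d, x_k)
    return mu_X * cov_Y_k + mu_Y * cov_X_k

# Monte Carlo sanity check of the closed form.
rng = np.random.default_rng(0)
s = rng.multivariate_normal(mu, Sigma, size=1_000_000)
z = (s[:, 0] + s[:, 1]) * (s[:, 2] + s[:, 3])
for k, name in enumerate("abcd"):
    analytic = cov_sum_product_with(mu, Sigma, k)
    empirical = np.cov(z, s[:, k])[0, 1]
    print(f"Cov(z, {name}): analytic={analytic:.4f}  MC={empirical:.4f}")
```

Of course this leans on the Gaussian assumption; it doesn't resolve my general question of moving between correlations of the variables and correlations of their logs when the product side of the calculation assumes lognormals.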