Meta Analysis with Several Regression Studies

AI Thread Summary
The discussion revolves around synthesizing the results of four regression studies in order to model one response variable, z_1, as a function of two predictors, x_1 and y_1. The original poster is looking for a comprehensive reference, such as a book or online PDF, on how to combine regression models in this way. A respondent asks whether the models were fit by treating both the z-variables and the x-variables as random (i.e., total least squares rather than ordinary least squares regression) and whether covariance estimates exist for all pairs of predictors. The poster clarifies that the studies report ordinary multivariate least squares fits and that an estimated covariance is available only for the pair x_1 and y_1, and that a reference covering either case would assist the analysis.
quantumdude
I have come across a problem that I need to solve, and it isn't your garden-variety regression problem. It isn't even covered in any of my books, of which I have many. I need either a book title or an online PDF that covers this material.

Suppose we have a response variable ##z_1## that depends on predictor variables ##x_1, x_2, \dots, x_n##. Further suppose that we have another response variable ##z_2## that depends on predictor variables ##y_1, y_2, \dots, y_m##.

There are 4 studies to be synthesized.

In Study 1 a regression model ##z_1 = \alpha_0 + \alpha_1 x_1 + \alpha_2 x_2 + \dots + \alpha_n x_n## is obtained.
In Study 2 a regression model ##z_2 = \beta_0 + \beta_1 y_1 + \beta_2 y_2 + \dots + \beta_m y_m## is obtained.
In Study 3 a correlation between ##z_1## and ##z_2## is obtained.
In Study 4 a correlation between ##x_1## and ##y_1## is obtained.

The goal is to synthesize these studies to model ##z_1## as a function of ##x_1## and ##y_1## only.
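To make the target concrete, here is a sketch of what the final step might look like if (and this is a big if, resting on assumptions I don't yet know how to justify) the four studies could somehow be pooled into a single correlation matrix for ##z_1##, ##x_1## and ##y_1##. All the numbers below are placeholders; given such a matrix, the regression of ##z_1## on ##x_1## and ##y_1## is just the standard computation of standardized coefficients from correlations.

```python
import numpy as np

# Placeholder pooled correlations among z_1, x_1 and y_1.  Recovering these
# from the four studies (e.g. r_x1_z1 from Study 1, r_x1_y1 from Study 4, and
# r_y1_z1 through some path-analytic assumption linking Studies 2 and 3) is
# the hard, assumption-laden part -- not this final step.
r_x1_z1 = 0.55
r_y1_z1 = 0.40
r_x1_y1 = 0.30

# Correlation matrix of the predictors and their correlations with z_1.
R_xx = np.array([[1.0,     r_x1_y1],
                 [r_x1_y1, 1.0    ]])
r_xz = np.array([r_x1_z1, r_y1_z1])

# Standardized regression coefficients of z_1 on (x_1, y_1):
#   beta = R_xx^{-1} r_xz   (least squares expressed through correlations)
beta = np.linalg.solve(R_xx, r_xz)

# Proportion of variance in z_1 explained by the two predictors.
r_squared = float(r_xz @ beta)

print("standardized betas (x_1, y_1):", beta)
print("R^2:", r_squared)
```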

What's a good read to get going on this? Thanks!
 
Stephen Tashi
Are these regression models fit by considering both the z-variable and the x-variables to be random variables (e.g., total least squares regression as opposed to ordinary least squares regression)?

Are ##x_1## and ##y_1## the only random variables with a given estimated covariance, or do all pairs ##x_j##, ##y_j## have an estimated covariance?
 
Hi Stephen, thanks for replying.

Stephen Tashi said:
Are these regression models fit by considering both the z-variable and the x-variables to be random variables (e.g., total least squares regression as opposed to ordinary least squares regression)?

I'm dealing with multivariate least squares regression models.
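Just so we're talking about the same distinction, here is how I picture it in code (only a toy sketch with simulated, made-up data): ordinary least squares treats the predictors as fixed and puts all of the error in ##z##, while total least squares lets both the predictors and the response carry error and can be read off from the SVD of the augmented data matrix ##[X \mid z]##.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: one response z depending on two predictors, with noise added
# to the predictors as well as to the response (the errors-in-variables setting
# that total least squares is meant for).
n = 200
x_true = rng.normal(size=(n, 2))
z_true = x_true @ np.array([1.5, -0.7])
X = x_true + 0.3 * rng.normal(size=x_true.shape)   # noisy predictors
z = z_true + 0.3 * rng.normal(size=n)              # noisy response

# Center everything so the intercept drops out.
Xc = X - X.mean(axis=0)
zc = z - z.mean()

# Ordinary least squares: errors assumed only in z.
b_ols, *_ = np.linalg.lstsq(Xc, zc, rcond=None)

# Total least squares: errors in both X and z, via the SVD of [X | z].
C = np.column_stack([Xc, zc])
_, _, Vt = np.linalg.svd(C, full_matrices=False)
v = Vt[-1]                      # right singular vector for the smallest singular value
b_tls = -v[:-1] / v[-1]

print("OLS coefficients:", b_ols)
print("TLS coefficients:", b_tls)
```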

Stephen Tashi said:
Are ##x_1## and ##y_1## the only random variables with a given estimated covariance, or do all pairs ##x_j##, ##y_j## have an estimated covariance?

It's just the one pair of predictor variables for which I have an estimated covariance. But leaving that aside, what I really want to know is whether there is a comprehensive reference from which I could learn how to combine regression models. It would be a bonus if both cases in your question were covered. Thanks!
 