Meta Analysis with Several Regression Studies

AI Thread Summary
The discussion concerns synthesizing the results of four regression studies to model one response variable, z_1, as a function of two predictors, x_1 and y_1. The original poster seeks a comprehensive reference, such as a book or online PDF, on how to combine regression models. A responder asks whether the models treat both the response and the predictor variables as random (which would call for total least squares rather than ordinary least squares regression), and whether covariance estimates exist for all predictor pairs. The poster clarifies that the models are multivariate least squares fits and that an estimated covariance is currently available only for the pair x_1 and y_1.
quantumdude
I have come across a problem that I need to solve, and it isn't your garden variety regression problem. It isn't even covered in any of my books, of which I have many. I need either a book title or an online PDF that covers this material.

Suppose we have a response variable z_1 that depends on predictor variables x_1,x_2,...,x_n. Further suppose that we have another response variable z_2 that depends on predictor variables y_1,y_2,...,y_m.

There are 4 studies to be synthesized.

In Study 1 a regression model z_1=\alpha_0+\alpha_1x_1+\alpha_2x_2+...+\alpha_nx_n is obtained.
In Study 2 a regression model z_2=\beta_0+\beta_1y_1+\beta_2y_2+...+\beta_my_m is obtained.
In Study 3 a correlation between z_1 and z_2 is obtained.
In Study 4 a correlation between x_1 and y_1 is obtained.

The goal is to synthesize these studies to model z_1 as a function of x_1 and y_1 only.
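For concreteness, the quantities the four studies provide can be sketched in Python. This is a minimal illustration with simulated data; all sample sizes, coefficients, and noise levels here are made up and are not from the actual studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data (all values illustrative): n = 3 x-predictors, m = 2 y-predictors.
n_obs = 500
x = rng.normal(size=(n_obs, 3))
y = rng.normal(size=(n_obs, 2))
y[:, 0] = 0.6 * x[:, 0] + rng.normal(scale=0.8, size=n_obs)  # induce corr(x_1, y_1)

z1 = 1.0 + x @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.5, size=n_obs)
z2 = 0.5 + y @ np.array([1.5, 0.7]) + rng.normal(scale=0.5, size=n_obs)

# Study 1: least squares fit of z_1 on x_1..x_n -> alpha_0..alpha_n
A = np.column_stack([np.ones(n_obs), x])
alpha, *_ = np.linalg.lstsq(A, z1, rcond=None)

# Study 2: least squares fit of z_2 on y_1..y_m -> beta_0..beta_m
B = np.column_stack([np.ones(n_obs), y])
beta, *_ = np.linalg.lstsq(B, z2, rcond=None)

# Study 3: correlation between z_1 and z_2
r_z1z2 = np.corrcoef(z1, z2)[0, 1]

# Study 4: correlation between x_1 and y_1
r_x1y1 = np.corrcoef(x[:, 0], y[:, 0])[0, 1]

print(alpha, beta, r_z1z2, r_x1y1)
```

The synthesis problem is then: given only alpha, beta, r_z1z2, and r_x1y1 (not the raw data), construct a model for z_1 in terms of x_1 and y_1 alone.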

What's a good read to get going on this? Thanks!
 
Are these regression models fit by considering both the z-variables and the x-variables to be random variables (e.g. total least squares regression as opposed to ordinary least squares regression)?

Are x_1 and y_1 the only pair of random variables with a given estimated covariance, or does every pair x_j, y_j have an estimated covariance?
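The distinction raised here can be illustrated with a one-predictor sketch, under the assumption that measurement noise corrupts both variables. Ordinary least squares treats only z as noisy; total least squares (computed below via the SVD of the centered data matrix) treats x and z symmetrically. All data and parameters are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative errors-in-variables data: true line z = 2 + 3x,
# with independent noise added to both x and z observations.
n = 1000
x_true = rng.uniform(0, 10, n)
z_true = 2.0 + 3.0 * x_true
x_obs = x_true + rng.normal(scale=1.0, size=n)
z_obs = z_true + rng.normal(scale=1.0, size=n)

# Ordinary least squares: only z treated as noisy.
# Noise in x attenuates the slope estimate toward zero.
b_ols = np.cov(x_obs, z_obs)[0, 1] / np.var(x_obs, ddof=1)

# Total least squares: both variables treated as noisy.
# The fitted line is orthogonal to the smallest right singular
# vector of the centered data matrix [x - mean, z - mean].
M = np.column_stack([x_obs - x_obs.mean(), z_obs - z_obs.mean()])
_, _, Vt = np.linalg.svd(M, full_matrices=False)
v = Vt[-1]                # direction of least variance
b_tls = -v[0] / v[1]      # slope of the fitted line

print(b_ols, b_tls)
```

With noise on x, the OLS slope comes out below the true value of 3, while the TLS slope (with equal noise variances on both axes) stays close to 3; which estimator is appropriate depends on how the original studies were fit, which is exactly the question above.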
 
Hi Stephen, thanks for replying.

Stephen Tashi said:
Are these regression models fit by considering both the z-variables and the x-variables to be random variables (e.g. total least squares regression as opposed to ordinary least squares regression)?

I'm dealing with multivariate least squares regression models.

Are x_1 and y_1 the only pair of random variables with a given estimated covariance, or does every pair x_j, y_j have an estimated covariance?

It's just the one pair of predictor variables for which I have an estimated covariance. But leaving that aside, what I really want to know is whether there is a comprehensive reference from which I could learn how to combine regression models. It would be a bonus if both cases in your question were covered. Thanks!
 