Graduate Differential of Multiple Linear Regression

Summary
The discussion focuses on the interpretation of changes in a log-level regression model, specifically how to derive the relationship between changes in the dependent variable Y and changes in an independent variable Xk. The initial formula suggests that an increase in Xk leads to a percentage increase in Y, but the transition from partial to total derivatives is highlighted as problematic without considering dependencies among variables. The correct total derivative incorporates contributions from all independent variables, leading to a more accurate understanding of how Y changes with respect to Xk. Ultimately, it concludes that Y increases by YβkδXk when Xk increases, assuming other variables remain constant. This emphasizes the importance of accounting for variable interdependencies in regression analysis.
fny
Say you have a log-level regression as follows:

$$\log Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \ldots + \beta_n X_n$$

We're trying to come up with a meaningful interpretation of the change in Y due to a change in some Xk.

If we take the partial derivative with respect to Xk, we end up with

$$\frac{dY}{Y} = \beta_k \cdot dX_k$$

which implies that if Xk increases by 1, you expect Y to increase by 100βk percent.
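That percentage interpretation is an approximation that holds for small βk, since a unit increase in Xk multiplies Y by exp(βk) ≈ 1 + βk. A minimal numeric sketch (the coefficients and two-regressor model here are hypothetical, just for illustration):

```python
import math

# Hypothetical log-level model: log Y = b0 + b1*X1 + b2*X2
b0, b1, b2 = 1.0, 0.05, 0.30

def Y(x1, x2):
    """Y implied by the log-level model."""
    return math.exp(b0 + b1 * x1 + b2 * x2)

# Increase X1 by 1, holding X2 fixed: Y is multiplied by exp(b1)
pct_change = (Y(3.0, 2.0) - Y(2.0, 2.0)) / Y(2.0, 2.0)
print(pct_change)   # exp(0.05) - 1, close to the 100*b1 = 5 percent reading
```

The gap between exp(βk) − 1 and βk grows with βk, which is why the "100βk percent" reading is only reliable for small coefficients.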

Can someone walk through the calculus to get from this

$$\frac{\partial}{\partial X_k} \log Y = \frac{\partial}{\partial X_k} (\beta_0 + \beta_1 X_1 + \beta_2 X_2 + \ldots + \beta_n X_n)$$

to this

$$\frac{dY}{Y} = \beta_k dX_k$$?

I'm particularly confused about how one transitions from a partial derivative to a total derivative.
 
In general one cannot make that transition. The second-last formula is correct, but the last is not unless there is no dependence between ##X_k## and any of the other ##X_j##s. A corrected version of the last formula is:
$$
\frac{dY}{Y} = \sum_{j=1}^n \beta_j dX_j
$$

To get the total derivative wrt ##X_k## we use the total derivative formula, for the case where ##Y## is a function of ##X_1,...,X_n##:

$$\frac{dY}{dX_k}=\sum_{j=1}^n \frac{\partial Y}{\partial X_j} \frac{dX_j}{dX_k}$$

In this case we have ##Y = \exp\left(\beta_0 + \sum_{j=1}^n \beta_j X_j\right)## so that ##\frac{\partial Y}{\partial X_j} = \beta_jY##, and we also have ##\frac{d X_k}{d X_k}=1##, so that the total derivative becomes:
$$\frac{dY}{dX_k}=Y\left(\beta_k + \sum_{\substack{j=1\\j\neq k}}^n \beta_j \frac{dX_j}{dX_k}\right)$$

This reduces to the formula you wrote above for the total derivative if all the ##\frac{dX_j}{dX_k}## are zero, i.e. if there are no dependencies between ##X_k## and any of the other ##X_j##.
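The dependent case can be checked numerically. A minimal sketch, assuming a hypothetical two-regressor model in which ##X_2 = 2X_1## (so ##dX_2/dX_1 = 2##), comparing a finite-difference derivative along that path with the total-derivative formula ##Y(\beta_1 + 2\beta_2)##:

```python
import math

b0, b1, b2 = 1.0, 0.05, 0.30   # hypothetical coefficients

def Y(x1, x2):
    """Y implied by the log-level model."""
    return math.exp(b0 + b1 * x1 + b2 * x2)

def Y_along_path(x1):
    """Y when X2 depends on X1 via the assumed relation X2 = 2*X1."""
    return Y(x1, 2 * x1)

x1, h = 1.5, 1e-6
# Central finite difference of Y along the path, dY/dX1
numeric = (Y_along_path(x1 + h) - Y_along_path(x1 - h)) / (2 * h)
# Total-derivative formula: Y * (b1 + b2 * dX2/dX1) with dX2/dX1 = 2
analytic = Y_along_path(x1) * (b1 + 2 * b2)
print(numeric, analytic)
```

Note the numeric slope matches ##Y(\beta_1 + 2\beta_2)##, not ##Y\beta_1##: ignoring the dependence would badly misstate the effect of changing ##X_1##.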

What we can say is that ##Y## increases by ##Y\beta_k\delta X_k## if ##X_k## increases by ##\delta X_k## and all other variables do not change.
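That ceteris-paribus statement is also easy to verify with a small finite change. A sketch under the same hypothetical coefficients as a two-regressor model, changing only ##X_1## by a small ##\delta X_1##:

```python
import math

b0, b1, b2 = 1.0, 0.05, 0.30   # hypothetical coefficients

def Y(x1, x2):
    """Y implied by the log-level model."""
    return math.exp(b0 + b1 * x1 + b2 * x2)

x1, x2, dx = 2.0, 1.0, 0.01    # small change in X1 only, X2 held fixed
exact = Y(x1 + dx, x2) - Y(x1, x2)   # actual change in Y
approx = Y(x1, x2) * b1 * dx         # first-order approximation Y*beta_1*dX1
print(exact, approx)
```

The two agree to first order in ##\delta X_1##, with the discrepancy shrinking as ##\delta X_1 \to 0##.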
 
