# Homework Help: General Relativity

1. Apr 18, 2010

### latentcorpse

I have two equations:

$\ddot{x}^\mu + \ddot{y}^\mu + \Gamma^\mu{}_{\nu \lambda} (x+y)(\dot{x}^\nu+\dot{y}^\nu)(\dot{x}^\lambda+\dot{y}^\lambda)=0$
and
$\ddot{x}^\mu + \Gamma^\mu{}_{\nu\lambda}(x) \dot{x}^\nu \dot{x}^\lambda=0$

Apparently, if I Taylor expand the first equation to first order in $y$ and then subtract the second equation, I should get

$\ddot{y}^\mu + \frac{\partial \Gamma^\mu{}_{\nu\lambda}}{\partial x^\rho} \dot{x}^\nu \dot{x}^\lambda y^\rho = 0$

I cannot show this. How do we go about Taylor expanding something like that?

2. Apr 18, 2010

### Fredrik

Staff Emeritus
Same way you'd expand any other function $$f:\mathbb R^4\rightarrow \mathbb R$$.

$$f(x+y)=f(x)+y^\rho f_{,\rho}(x)+\mathcal O(y^2)$$
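Concretely (a sketch, treating each component $\Gamma^\mu{}_{\nu\lambda}$ as such a function of the coordinates), the expansion applied to the connection reads

```latex
\Gamma^\mu{}_{\nu\lambda}(x+y)
  = \Gamma^\mu{}_{\nu\lambda}(x)
  + y^\rho\,\partial_\rho \Gamma^\mu{}_{\nu\lambda}(x)
  + \mathcal O(y^2).
```

Substituting this into the first equation, expanding the products $(\dot{x}^\nu+\dot{y}^\nu)(\dot{x}^\lambda+\dot{y}^\lambda)$, dropping everything beyond first order in the small quantities $y$, $\dot{y}$, and then subtracting the geodesic equation for $x$ leaves the terms linear in $y$ and $\dot{y}$.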