afallingbomb
I am trying to linearize a function, f(x), where x is a normally distributed N(0,1) random variable. How can I perform a Taylor series expansion around a deterministic value x0 (sketched below)? Thanks.
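For concreteness, the linearization I have in mind is the standard first-order expansion, assuming f is differentiable at x_0:

f(x) \approx f(x_0) + f'(x_0)\,(x - x_0)

If x_0 is taken to be the mean, E[x] = 0, this would give the usual moment approximations E[f(x)] \approx f(x_0) and Var[f(x)] \approx f'(x_0)^2 \operatorname{Var}(x) = f'(x_0)^2, but I am not sure this is the right way to set it up.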