Taylor Series Linearization of f(x) Around x0

afallingbomb
I am trying to linearize a function f(x), where x is a normally distributed N(0,1) random variable. How can I perform a Taylor series expansion around a deterministic value x0? Thanks.
 
You expand exactly as you would in the ordinary, deterministic case:

f(x) = f(x0) + (x - x0)f'(x0) + ...

The fact that x is a random variable has no effect on the expansion itself; it matters only when you want statistical properties of f(x). For example, truncating after the linear term and choosing x0 = E[x] = 0 gives the standard first-order approximations E[f(x)] ≈ f(x0) and Var[f(x)] ≈ f'(x0)^2 Var(x) = f'(x0)^2.
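If you want to check those statistical properties numerically, here is a minimal sketch in Python with NumPy. The choice of f(x) = e^x and x0 = 0 is purely illustrative (the thread never specifies f); it compares the Monte Carlo moments of the exact f(x) against those of its first-order linearization:

```python
import numpy as np

# Illustrative example: linearize a sample function f (here exp, an
# arbitrary choice) around a deterministic point x0, then compare
# moments of f(x) for x ~ N(0,1) against the linear approximation.

def f(x):
    return np.exp(x)          # example function; swap in your own

def f_prime(x):
    return np.exp(x)          # its derivative

x0 = 0.0                      # deterministic expansion point

def f_lin(x):
    # First-order Taylor expansion: f(x0) + (x - x0) f'(x0)
    return f(x0) + (x - x0) * f_prime(x0)

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)   # draws of x ~ N(0, 1)

# With x0 = E[x] = 0, the linear theory predicts
#   E[f(x)] ~= f(x0)  and  Var[f(x)] ~= f'(x0)**2 * Var(x).
print("E[f(x)]:   exact ~", f(x).mean(), " linearized ~", f_lin(x).mean())
print("Var[f(x)]: exact ~", f(x).var(),  " linearized ~", f_lin(x).var())
```

For this particular f the gap is instructive: the linearized variance is f'(0)^2 = 1, while the exact variance of e^x with x ~ N(0,1) is (e - 1)e ≈ 4.67, so the linear approximation degrades when f is strongly curved over the bulk of the distribution.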
 
