#1 tamzam
Hello,
My question seems simple. I would like to numerically solve the following first-order ODE to obtain v(x):
v'(x) = b·[v(x) − f(x)], given the boundary condition v(+infinity) = 0 [b is a known constant].
The problem is that f(x) is not known explicitly; it is only sampled on a dense x grid.
Can you please help me? Any hints will be appreciated.
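One common approach, sketched below under stated assumptions: interpolate the sampled f(x), impose v ≈ 0 at the right end of the grid (as a stand-in for the condition at +infinity), and integrate the ODE backwards from large x toward small x. If b > 0, the homogeneous solution e^(bx) grows forwards but decays backwards, so backward integration is the numerically stable direction. The grid, the value of b, and the sample values f_samples here are all hypothetical placeholders for your own data (f(x) = e^(−x) is used only so the result can be checked against the exact solution v(x) = b/(1+b)·e^(−x)).

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.integrate import solve_ivp

# Hypothetical stand-ins for the real problem: a known constant b > 0 and
# f sampled on a dense grid (here f(x) = exp(-x), chosen so the numerical
# answer can be compared with the exact solution).
b = 2.0
x_grid = np.linspace(0.0, 20.0, 2001)
f_samples = np.exp(-x_grid)

# Turn the tabulated samples into a callable f(x).
f = interp1d(x_grid, f_samples, kind="cubic",
             bounds_error=False, fill_value=0.0)

# Impose v(+infinity) = 0 at the right end of the grid and integrate
# backwards: from x_max down to x_min.  For b > 0 this direction damps
# the growing homogeneous mode e^(b*x), so errors decay.
sol = solve_ivp(lambda x, v: b * (v - f(x)),
                (x_grid[-1], x_grid[0]),
                [0.0],                 # v(x_max) ~ v(+infinity) = 0
                t_eval=x_grid[::-1],   # evaluate on the grid, descending
                rtol=1e-8, atol=1e-10)

v = sol.y[0][::-1]   # reorder to match the ascending x_grid
```

For this test case the exact solution is v(x) = (b/(1+b))·e^(−x), so v(0) should come out near 2/3; with your own data you would simply replace x_grid and f_samples with the tabulated values. The right end of the grid must be far enough out that f (and hence v) is effectively zero there, or the imposed boundary value will bias the solution.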