keliu
I have a Gaussian distribution over t, say N(t; μ, σ), and a Dirac-delta-style indicator δ(t > 0) (i.e., the function equal to 1 when t > 0 and 0 otherwise).
How can I compute the product N(t; μ, σ) · δ(t > 0)?
Any clues? Or can you recommend some materials for me to read?
Thanks!
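If δ(t > 0) is read as the indicator 1{t > 0}, the product N(t; μ, σ) · 1{t > 0} is zero for t ≤ 0 and an unnormalized Gaussian for t > 0; dividing by P(T > 0) = Φ(μ/σ) turns it into a proper truncated-normal density. A minimal sketch under that interpretation (function name and the zero-truncation point are my own choices):

```python
import math

def truncated_normal_pdf(t, mu, sigma):
    """Density of N(mu, sigma) truncated to t > 0.

    The raw product N(t; mu, sigma) * 1{t > 0} is zero for t <= 0;
    renormalizing by P(T > 0) = Phi(mu / sigma) makes it integrate to 1.
    """
    if t <= 0:
        return 0.0
    # Untruncated Gaussian density at t
    phi = math.exp(-0.5 * ((t - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
    # Normalizing constant: P(T > 0) = Phi(mu / sigma), via the error function
    Z = 0.5 * (1.0 + math.erf((mu / sigma) / math.sqrt(2.0)))
    return phi / Z
```

For μ = 0, σ = 1 this is just twice the standard normal density on t > 0, which is an easy sanity check.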