I am writing a system that collects values from our devices (CRAH units, generators, etc.), computes the trend of the data, and determines whether an alarm should be raised. For testing I have a point generator, and I want it to follow fairly realistic norms. Currently I generate a new point as:

    p.Y = p'.Y + rand.NextDouble() - .5;

where p' is the previous point and Y is the value being generated (p.X is a point in time). This takes the value of the last point and adds a random variation of +/- .5. The problem is that rand generates uniformly distributed numbers, so every step size in that range is equally likely. What I want is to put that point through a computation so that the step follows a bell curve: a small difference (say, in the range -.09 to .09) should be much more likely than one near +/- .5, and +/- .5 should have close to a zero chance of being generated.

I've used Google and searched a few math forums, but most of the discussion is about analyzing data after the fact, not about generating the points. Any suggestions on a formula that would produce the desired distribution?
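To make the desired behavior concrete, here is a sketch of what I mean in Java (the class name `TrendGenerator` and the step standard deviation `SIGMA = 0.15` are illustrative choices of mine, not part of our system); it uses `Random.nextGaussian()` to draw each step from a normal distribution instead of a uniform one:

```java
import java.util.Random;

public class TrendGenerator {
    private final Random rand = new Random();
    private double y = 0.0;

    // Standard deviation of each step. With SIGMA = 0.15, about 99.7% of
    // steps land within +/- 0.45, so differences near 0 are the most likely
    // and +/- 0.5 is nearly impossible -- the bell-curve shape described above.
    private static final double SIGMA = 0.15;

    /** Advance the series by one normally distributed step and return the new value. */
    public double next() {
        // nextGaussian() returns a sample with mean 0 and standard deviation 1;
        // scaling by SIGMA narrows the bell curve to the desired step range.
        y += SIGMA * rand.nextGaussian();
        return y;
    }

    public static void main(String[] args) {
        TrendGenerator g = new TrendGenerator();
        for (int i = 0; i < 10; i++) {
            System.out.printf("t=%d y=%.4f%n", i, g.next());
        }
    }
}
```

Unlike the uniform version, the normal distribution has unbounded tails, so if a hard +/- 0.5 limit is required, the step would additionally need to be clamped or re-drawn when it falls outside that range.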