I have a pulse generator that produces negative Gaussian pulses with a FWHM of ~0.3 ns. It is driven by the falling edge of a regular square wave from a function generator. I want to measure the jitter of the pulse on an oscilloscope by comparing the function generator output (ch1) with the pulse generator output (ch2).

The scope is a LeCroy WR620Zi, which has a lot of measurement capabilities that I don't really understand. I've tried the delta delay measurement, which measures the "time betw/ the 50% crossing of first transition of two waveforms," so it seems to give the delay from the falling edge of the square wave to the pulse. It also reports the standard deviation of those measurements.

Is this standard deviation the jitter I'm looking for? Or am I going about this the wrong way? Any guidance would be appreciated.
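To check my own understanding of what the scope might be reporting, I mocked the measurement up offline. This is only a sketch under my assumption that "delta delay" amounts to the difference of (interpolated) 50% crossing times of the two traces; the sample rate, injected jitter value, and waveform shapes below are all made up:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 40e9                        # assumed 40 GS/s sample rate
t = np.arange(0, 5e-9, 1 / fs)   # 5 ns record

def crossing_time(t, y, level, falling=True):
    """Linearly interpolated time of the first crossing of `level`."""
    y = y - level
    if falling:
        idx = np.where((y[:-1] > 0) & (y[1:] <= 0))[0][0]
    else:
        idx = np.where((y[:-1] < 0) & (y[1:] >= 0))[0][0]
    # linear interpolation between samples idx and idx+1
    frac = y[idx] / (y[idx] - y[idx + 1])
    return t[idx] + frac * (t[idx + 1] - t[idx])

true_jitter = 10e-12             # pretend 10 ps rms jitter on the pulse
sigma = 0.3e-9 / 2.355           # FWHM 0.3 ns -> Gaussian sigma

# ch1: square wave with a falling edge at 1 ns (high = 1, low = 0)
ch1 = np.where(t < 1e-9, 1.0, 0.0)
t1 = crossing_time(t, ch1, 0.5, falling=True)

delays = []
for _ in range(1000):
    # ch2: negative Gaussian pulse nominally centered at 2 ns, plus jitter
    t0 = 2e-9 + rng.normal(0, true_jitter)
    ch2 = -np.exp(-0.5 * ((t - t0) / sigma) ** 2)
    # 50% level of the negative pulse is -0.5; first transition is falling
    t2 = crossing_time(t, ch2, -0.5, falling=True)
    delays.append(t2 - t1)

delays = np.array(delays)
print(f"mean delay: {delays.mean() * 1e9:.3f} ns")
print(f"std dev of delay: {delays.std() * 1e12:.1f} ps")
```

On this toy model the standard deviation of the delay comes back close to the jitter I injected, which is what makes me suspect the scope's std dev is the number I want — but I don't know if that's actually how the instrument computes it.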