Hi, let's say I want to measure the frequency f of a periodic signal. I take N data points with an arbitrary time step T. The question is: for a fixed N, how should I choose T to get the best accuracy?

In principle the frequency resolution of the Fourier transform is 1/(N*T), which would suggest choosing T as large as possible, i.e. T = 1/(2*f) (the Nyquist limit). However, I am not sure about this, for several reasons. For example, one may use least-squares fitting instead of the Fourier transform, which may improve the resolution, especially if I measure only a few oscillations. Another possibility is to pad the measured signal with a constant (e.g. zeros): this artificially increases the total measurement time, and with it the resolution. By the way, what is the theoretical maximum accuracy one can reach with this latter trick?
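To make the comparison concrete, here is a minimal sketch (my own illustration, not a definitive answer) of the two estimators I have in mind: picking the peak of a zero-padded FFT, versus a least-squares fit of A*sin + B*cos over a grid of candidate frequencies. The signal parameters (f = 3.17 Hz, N = 64, T = 0.05 s, noise level) are arbitrary numbers chosen for the example.

```python
import numpy as np

def fft_peak_frequency(x, T, pad_factor=8):
    """Estimate the frequency from the peak of a zero-padded FFT.
    Zero-padding interpolates the spectrum on a finer grid, but it
    adds no new information beyond the N measured samples."""
    n = len(x) * pad_factor
    spectrum = np.abs(np.fft.rfft(x, n=n))
    freqs = np.fft.rfftfreq(n, d=T)
    return freqs[np.argmax(spectrum)]

def least_squares_frequency(x, T, f_grid):
    """Estimate the frequency by least-squares fitting
    A*sin(2*pi*f*t) + B*cos(2*pi*f*t) for each candidate f
    and keeping the f with the smallest residual (1-D search)."""
    t = np.arange(len(x)) * T
    best_f, best_rss = None, np.inf
    for f in f_grid:
        M = np.column_stack([np.sin(2 * np.pi * f * t),
                             np.cos(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(M, x, rcond=None)
        r = x - M @ coef
        rss = r @ r
        if rss < best_rss:
            best_rss, best_f = rss, f
    return best_f

# Example parameters (arbitrary, for illustration only).
rng = np.random.default_rng(0)
f_true, N, T = 3.17, 64, 0.05      # T is well below the Nyquist limit 1/(2*f)
t = np.arange(N) * T
x = np.sin(2 * np.pi * f_true * t) + 0.05 * rng.standard_normal(N)

f_fft = fft_peak_frequency(x, T)
f_ls = least_squares_frequency(x, T, np.linspace(2.5, 4.0, 3001))
print("FFT-peak estimate:", f_fft)
print("Least-squares estimate:", f_ls)
```

With these numbers the raw FFT bin spacing is 1/(N*T) ≈ 0.31 Hz, so the zero-padded peak and especially the least-squares search land much closer to f than the nominal resolution would suggest, which is exactly why I doubt that 1/(N*T) is the right figure of merit.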