
Discrete Fourier transform of data with two different sampling frequencies

  1. Jul 10, 2014 #1
    Hi All,

    I have a problem I've been thinking about for a while, but I haven't come up with a really satisfactory solution:

    I want to do a discrete Fourier transform on data that has been sampled at two different sampling frequencies. I've attached a picture of what my data will look like. The goal of the two different sampling rates is to use a long waveform for good resolution in the Fourier transform (to be able to see low frequencies), and also to use a high sample rate to see the high frequencies in the data.

    I've considered averaging the high-rate data down, doing a fast Fourier transform (FFT) on that, then doing another independent FFT of just the higher-sample-rate segment, and finally stitching these FFTs together. The problem is that this gives the two FFTs different resolutions, and I believe it also causes some unorthodox aliasing issues.
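    The resolution mismatch from stitching two independent FFTs can be sketched in Python. (All rates and lengths below are illustrative values for the sketch, not the actual hardware's.)

    ```python
    import numpy as np

    # Two segments: a fast rate and a 16x-slower rate (illustrative values).
    fs_hi, fs_lo = 625e3, 625e3 / 16
    t_hi = np.arange(2048) / fs_hi        # short, finely sampled segment
    t_lo = np.arange(2048) / fs_lo        # long, coarsely sampled segment

    x_hi = np.sin(2 * np.pi * 50e3 * t_hi)   # high-frequency content
    x_lo = np.sin(2 * np.pi * 500 * t_lo)    # low-frequency content

    # Each segment gets its own FFT with its own frequency grid...
    f_hi = np.fft.rfftfreq(len(x_hi), 1 / fs_hi)
    f_lo = np.fft.rfftfreq(len(x_lo), 1 / fs_lo)
    X_hi = np.abs(np.fft.rfft(x_hi))
    X_lo = np.abs(np.fft.rfft(x_lo))

    # ...so the bin spacings differ, which is the resolution mismatch above.
    print(f_hi[1] - f_hi[0])   # ~305 Hz spacing for the fast segment
    print(f_lo[1] - f_lo[0])   # ~19 Hz spacing for the slow segment
    ```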

    Any suggestions/references would be greatly appreciated,

    (I'm familiar with Fourier transforms for non-uniformly sampled data, such as the Lomb-Scargle method, but I'm hoping for a simpler solution since my data won't be completely non-uniform, just at two different frequencies.)

    Attached Files:

  3. Jul 10, 2014 #2
    I don't think what you're trying to do will work as described.
    When you use the lower sampling frequency, the higher-frequency components will alias and skew your data.

    Could you first run the data through a high-pass or low-pass filter and then do the DFT?
  4. Jul 10, 2014 #3
    Yes, thanks for clearing that up; I now agree that all the aliasing in my slower-sampled data will give me useless Fourier-transform information.

    Thank you also for your suggestion re: low-pass filtering, and I'd like to pursue this further. I suppose the steps to take would be to (1) put the slower-sampled data through a low-pass filter, (2) do an FFT on the output of the filter to get an unaliased spectrum, and (3) above the Nyquist frequency of the slower-sampled data, stitch on the FFT of the higher-sampled data.

    Following these steps, I'd still have the problem of a much poorer resolution at higher frequencies. Since it would be nice to have evenly-sampled spectral data for later analysis of the spectra, I might just break up the larger bins at the higher frequencies into smaller bins.
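    Breaking the larger high-frequency bins into smaller ones amounts to interpolating the coarse spectrum onto a finer grid. A minimal sketch with invented values (note that the interpolation just spreads each coarse bin's value over several fine bins; it adds no real resolution):

    ```python
    import numpy as np

    # Coarse high-frequency spectrum bins and made-up magnitudes.
    f_coarse = np.array([20e3, 21e3, 22e3, 23e3])   # Hz
    p_coarse = np.array([1.0, 0.5, 0.25, 0.125])

    # Resample onto a finer (hypothetical) 250 Hz grid by linear interpolation.
    f_fine = np.arange(20e3, 23e3 + 1, 250.0)
    p_fine = np.interp(f_fine, f_coarse, p_coarse)

    print(len(f_fine))   # 13 fine bins covering the same span
    ```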

    Does this sound like the best solution? And if so, is there a good starting-point reference for a digital low-pass filter?

    Thanks again!
  5. Jul 10, 2014 #4
    I think you're on the right track. However, you would need to filter the data BEFORE sampling. Otherwise, filtered or not, you'll still have aliased data.

    I'd start by taking a look at Matlab filter design and go from there. I just finished a course on DSP two months ago and I don't really remember a heck of a lot about designing a filter. I'll take a look at my notes this weekend and see if anything jogs my memory. In the meantime, start with http://www.mathworks.com/discovery/filter-design.html
    or http://myweb.dal.ca/gonzalej/Teaching/ECED4502/ECED4502.html (class notes from my DSP class)
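    For a flavor of what those filter-design tools produce, here's a numpy-only sketch of a windowed-sinc FIR lowpass (all numbers are illustrative, not from the thread's hardware):

    ```python
    import numpy as np

    # Design a windowed-sinc FIR lowpass: 15 kHz cutoff at a 625 kHz rate.
    fs = 625e3
    cutoff = 15e3
    ntaps = 101
    n = np.arange(ntaps) - (ntaps - 1) / 2
    h = np.sinc(2 * cutoff / fs * n) * np.hamming(ntaps)  # windowed ideal lowpass
    h /= h.sum()                                          # unity DC gain

    # Test signal: a 1 kHz tone we want to keep plus a 100 kHz tone to remove.
    t = np.arange(4096) / fs
    x = np.sin(2 * np.pi * 1e3 * t) + np.sin(2 * np.pi * 100e3 * t)
    y = np.convolve(x, h, mode="same")   # only the 1 kHz tone survives
    ```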
  6. Jul 10, 2014 #5
    How much control do you have over the sampling rates and pre-filtering (before sampling)? Is it completely up to you? If it is, how do you know when to switch to high sampling rate?
  7. Jul 10, 2014 #6
    Maybe I should have provided some more context:

    I have very little control over the sampling rates and the pre-filtering (without some hardware changes to our readout electronics). The electronics have been designed to provide a waveform trace with these two different sampling frequencies. The electronics know to switch to the high sample rate because they know where the pulse peak occurs in time (see the attached .pdf in my original post).

    To be more accurate (and probably make more sense): the electronics actually digitize the entire waveform trace at the original high frequency (625 kHz), but when outputting the data they average the data on either side of the pulse peak in blocks of 16 bins.
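    The block averaging described above looks like this in Python (the input data here is a toy tone, not the real readout format):

    ```python
    import numpy as np

    # Digitize at the full 625 kHz rate (toy 1 kHz tone for illustration).
    fs = 625e3
    x = np.sin(2 * np.pi * 1e3 * np.arange(4096) / fs)

    # Average every 16 consecutive samples, as the readout does off-peak.
    x_avg = x.reshape(-1, 16).mean(axis=1)
    fs_avg = fs / 16                        # effective rate of the averaged data

    print(len(x_avg), fs_avg)               # 256 samples at 39062.5 Hz
    ```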

    This was a nifty solution to reducing data throughput, but it's clearly complicating the spectrum analysis side of things! Because of the difficulty of applying a filter pre-digitization and the points that cpscdave has made, I'm considering looking into a Lomb-Scargle sort of analysis for the fourier transforms.
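    For reference, a Lomb-Scargle periodogram handles irregular sample times directly; scipy ships one. A toy sketch (random times and a made-up 40 Hz tone, not the real trace):

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    # Irregularly spaced sample times over 1 second, carrying a 40 Hz tone.
    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0.0, 1.0, 500))
    x = np.sin(2 * np.pi * 40.0 * t)

    # Evaluate the periodogram on a grid of trial frequencies (in Hz).
    freqs = np.linspace(1.0, 100.0, 400)
    pgram = lombscargle(t, x, 2 * np.pi * freqs)  # scipy wants angular frequency

    # The periodogram peaks near the 40 Hz tone despite the uneven sampling.
    print(freqs[np.argmax(pgram)])
    ```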

    Any additional advice would be greatly appreciated.
  8. Jul 10, 2014 #7

    So if it's reasonable to assume that the low-sampling-rate region contains no high-frequency information of interest, then a programmatically simple approach would be to upsample that data so that you have a full 625 kHz record, then apply your favorite spectral estimation method from there.
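    The crudest version of that upsampling just repeats each averaged sample 16 times to rebuild a uniform 625 kHz grid (linear or sinc interpolation would be smoother; data here is invented):

    ```python
    import numpy as np

    # The averaged (low-rate) section: 256 samples at 625 kHz / 16.
    x_lo = np.sin(2 * np.pi * 500 * np.arange(256) / 39062.5)

    # Hold each averaged value for 16 samples to get back to the 625 kHz grid.
    x_up = np.repeat(x_lo, 16)

    print(len(x_up))   # 4096 samples, same grid as the fast section
    ```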

    cpscdave's suggestion to properly lowpass the 625 kHz section could also work. That should give you a full 40 kHz record without aliasing, then you can join that with a separately-computed spectrum from the higher-resolution section. The resolutions of the different sections will be different, but so what?
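    Putting the corrected order of operations together: lowpass the full 625 kHz record FIRST, then keep every 16th sample, so nothing above the new Nyquist frequency aliases. A numpy-only sketch with illustrative filter and signal:

    ```python
    import numpy as np

    fs = 625e3
    t = np.arange(8192) / fs
    # A 2 kHz tone to keep, plus a 150 kHz tone that would alias if we
    # downsampled without filtering.
    x = np.sin(2 * np.pi * 2e3 * t) + np.sin(2 * np.pi * 150e3 * t)

    # Windowed-sinc FIR lowpass with cutoff at the post-decimation Nyquist.
    n = np.arange(129) - 64
    h = np.sinc(2 * (fs / 16 / 2) / fs * n) * np.hamming(129)
    h /= h.sum()

    # Filter, THEN downsample by 16 -- the order cpscdave pointed out.
    x_dec = np.convolve(x, h, mode="same")[::16]

    print(len(x_dec))   # 512 samples at 39.0625 kHz, free of the 150 kHz tone
    ```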