CricK0es
- Homework Statement
- RMS and Peak voltages of time slice in dynamic periodic signal
- Relevant Equations
- Parseval's theorem?
Hi all!
I have a complex waveform, something like what I have attached, and I need to extract the RMS and peak voltage from it. However, the signal can and will change after a number of cycles on a given setting, so the period and pulse duration can change, and with them the RMS and peak. Imagine the entire signal is saved and I can scroll through it and perform DSP at will; I don't want an average over the whole record, I want a specific analysis of each setting.
After I have run the signal through my variable op-amp and ADC (they aren't a problem; assume the sampling error is ±0.5%, i.e. negligible), conceptually how would I obtain these quantities using a programmed FPGA? Essentially, how can I ensure I am taking the correct time slice for analysis?
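To make the slicing problem concrete: one way I could imagine locating where the setting changes is to run a short sliding RMS window over the stored record and look for jumps. This is only a rough NumPy sketch with a toy signal; the `sliding_rms` helper and all the numbers are my own made-up example, not a real design.

```python
import numpy as np

def sliding_rms(x, win):
    """RMS over a sliding window of `win` samples (toy helper)."""
    power = np.convolve(x ** 2, np.ones(win) / win, mode="valid")
    return np.sqrt(power)

# Toy signal: amplitude steps from 1 to 3 halfway through (a "setting" change)
fs = 1000                      # assumed sample rate, Hz
n = np.arange(2 * fs)          # 2 s of samples
t = n / fs
amp = np.where(n < fs, 1.0, 3.0)
x = amp * np.sin(2 * np.pi * 50 * t)

win = 100                      # 100 ms window at 1 kHz (5 full 50 Hz cycles)
r = sliding_rms(x, win)

# A jump in the windowed RMS marks the boundary between settings,
# so each slice for analysis can start/end there
boundary = np.argmax(np.abs(np.diff(r)))
```

The window length trades off response time against ripple; here it is an exact multiple of the carrier period, so the windowed RMS is flat within each setting.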
Currently, I am thinking about using an FFT and Parseval's theorem to obtain the RMS of my signal slice, with the peak just being a max{} search over the slice data. So I guess my main issue is being able to adapt and slice out the correct period as the signal's period changes. In my research I have seen a lot about using a Hilbert transform and estimating changing/dynamic periods from envelope maxima. Is that along the right lines?
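To show what I mean by the Parseval route, here is a rough NumPy/SciPy sketch on a made-up gated burst (sample rate, amplitudes, and duty cycle are all invented for illustration): the RMS from the FFT bins matches the time-domain RMS, the peak is a plain max search, and the Hilbert envelope tracks the burst structure.

```python
import numpy as np
from scipy.signal import hilbert

fs = 10_000                          # assumed sample rate, Hz
n = np.arange(int(0.5 * fs))         # 0.5 s of samples
t = n / fs

# Toy pulsed waveform: a 2 V, 50 Hz burst gated on for 40% of each 100 ms period
carrier = 2.0 * np.sin(2 * np.pi * 50 * t)
gate = ((n % 1000) < 400).astype(float)
x = carrier * gate

# Slice out exactly one full period of the current "setting"
period_samples = int(0.1 * fs)
slc = x[:period_samples]

# RMS directly in the time domain
rms_time = np.sqrt(np.mean(slc ** 2))

# RMS via Parseval's theorem: sum|X[k]|^2 / N^2 equals the mean of x^2
X = np.fft.fft(slc)
rms_parseval = np.sqrt(np.sum(np.abs(X) ** 2) / len(slc) ** 2)

# Peak voltage is just a max search over the slice
peak = np.max(np.abs(slc))

# Hilbert envelope: |analytic signal| follows the burst on/off structure,
# so its maxima/edges could be used to estimate a changing period
envelope = np.abs(hilbert(x))
```

Note the time-domain RMS comes out identical to the Parseval one, which suggests the FFT step is only worth its cost if the spectrum is needed anyway.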
Nonetheless, thank you for your help, and apologies if this is unclear; naturally, it's hard to explain something you don't yet understand.