nkinar
Hello,
I am reading a paper which describes a somewhat unfamiliar mathematical procedure. The paper asks for a 2D spectrum [tex]U(t, \omega)[/tex] of a signal [tex]s(t)[/tex], calculated using the short-time Fourier transform (Gabor transform). This part is reasonably straightforward.
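For reference, here is roughly how I am computing the 2D spectrum (a minimal sketch in Python/NumPy/SciPy; the test signal and window parameters are just stand-ins I made up, not values from the paper):

```python
import numpy as np
from scipy.signal import stft

# Hypothetical test signal (a stand-in for the paper's s(t))
fs = 1000.0                       # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)   # 1 second of samples
s = np.cos(2 * np.pi * 50.0 * t)  # a 50 Hz tone

# Short-time Fourier transform with a Gaussian window
# (the Gabor transform is the STFT with a Gaussian window specifically)
f, tau, U = stft(s, fs=fs, window=('gaussian', 32), nperseg=256)

omega = 2.0 * np.pi * f  # angular frequency axis, omega = 2*pi*f
# U[k, m] corresponds to U(tau[m], omega[k]):
# frequency along rows, window-center time along columns
```

The window length and Gaussian width here are arbitrary; the point is only that the result is a complex-valued matrix sampled on a regular [tex](t, \omega)[/tex] grid.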
However, the paper then asks for the 2D spectrum [tex]U(t, \omega)[/tex] to be collapsed into a 1D spectrum [tex]U(t * \omega) = U(\chi)[/tex], where [tex]\chi = t * \omega[/tex].
In the above, [tex]t[/tex] is the time and [tex]\omega = 2 \pi f[/tex] is the angular frequency.
Apparently the 2D spectrum needs to be converted into a 1D spectrum by integrating along curves of constant [tex]\chi[/tex], i.e. the hyperbolas [tex]t \omega = \text{const}[/tex] in the [tex](t, \omega)[/tex] plane.
To me, this appears to be a form of line integral, evaluated numerically.
I can't find a good reference on how to do this, and I don't really know what is meant by integration along constant [tex]\chi[/tex]. Perhaps it means "averaging along constant [tex]\chi[/tex]"?
I've tried simply multiplying each element in the rows of [tex]U(t, \omega)[/tex] by each element in the columns of [tex]U(t, \omega)[/tex] using two nested for loops, but a plot of the resulting 1D spectrum "jumps around" instead of being smooth.
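My current guess at what "integration along constant [tex]\chi[/tex]" means is to compute [tex]\chi = t \omega[/tex] at every grid point, bin the grid points by their [tex]\chi[/tex] value, and average [tex]|U|[/tex] within each bin. A minimal sketch of that idea (the grid, bin count, and random placeholder for [tex]|U|[/tex] are all assumptions of mine, not from the paper):

```python
import numpy as np

# Hypothetical axes and spectrum magnitude (stand-ins for the paper's |U(t, omega)|)
t = np.linspace(0.0, 1.0, 64)           # time axis
omega = np.linspace(0.0, 100.0, 128)    # angular frequency axis
rng = np.random.default_rng(0)
U = rng.random((t.size, omega.size))    # placeholder for |U(t, omega)|

# chi = t * omega at every (t, omega) grid point
chi = np.outer(t, omega)

# Bin the grid points by chi and average |U| within each bin,
# i.e. interpret "integration along constant chi" as a per-bin mean
n_bins = 50
edges = np.linspace(chi.min(), chi.max(), n_bins + 1)
idx = np.digitize(chi.ravel(), edges) - 1
idx = np.clip(idx, 0, n_bins - 1)       # fold the right edge into the last bin

sums = np.bincount(idx, weights=U.ravel(), minlength=n_bins)
counts = np.bincount(idx, minlength=n_bins)
U_chi = np.where(counts > 0, sums / np.maximum(counts, 1), 0.0)

chi_centers = 0.5 * (edges[:-1] + edges[1:])  # 1D chi axis for plotting U_chi
```

Averaging (rather than summing) per bin compensates for the fact that the hyperbolas of constant [tex]\chi[/tex] cross very different numbers of grid cells, which is my guess at why a raw sum would jump around. I'd welcome correction if the paper intends a proper line integral with arc-length weighting instead.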
The paper says that the 1D spectrum will decrease monotonically along the [tex]\chi[/tex] axis, and the figures in the paper clearly show this monotonic decrease.
Has anyone seen anything similar, or would someone be able to point me in the right direction?