Numerical integration along constant and collapsing spectrum

1. Mar 25, 2010

nkinar

Hello---

I am reading a paper which describes a somewhat unfamiliar mathematical procedure. The paper computes a 2D spectrum $$U(t, \omega)$$ of a signal $$s(t)$$ using the short-time Fourier transform (Gabor transform). This part is reasonably straightforward.

However, the paper then asks for the 2D spectrum $$U(t, \omega)$$ to be collapsed into a 1D spectrum $$U(t \omega) = U(\chi)$$, where $$\chi = t \omega$$.

In the above, $$t$$ is the time and $$\omega = 2 \pi f$$ is the angular frequency.

Apparently the 2D spectrum needs to be converted into a 1D spectrum by integration along constant $$\chi$$.

To me, this appears to be a form of line integral, evaluated numerically.

I can't find a good reference on how to do this, and I don't really know what is meant by integration along constant $$\chi$$. Perhaps this means "averaging along constant $$\chi$$," i.e., averaging $$U(t, \omega)$$ over the level sets where $$t \omega$$ takes the same value?
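To make my guess concrete, here is a minimal sketch of what I *think* "integration along constant $$\chi$$" could mean (this is my interpretation, not something from the paper): evaluate $$\chi = t \omega$$ at every grid point of $$U(t, \omega)$$, bin the spectrum values by $$\chi$$, and sum or average within each bin. The function names, the bin count, and the toy spectrum below are all my own for illustration.

```python
import numpy as np

def collapse_spectrum(U, t, omega, n_bins=100):
    """Collapse a 2D spectrum U(t, omega) onto the chi = t*omega axis
    by averaging |U| over bins of (approximately) constant chi.
    This is a guess at 'integration along constant chi'."""
    # chi value at every (t, omega) grid point
    T, W = np.meshgrid(t, omega, indexing="ij")
    chi = (T * W).ravel()
    vals = np.abs(U).ravel()

    # sum of spectrum values and point counts within each chi bin
    edges = np.linspace(chi.min(), chi.max(), n_bins + 1)
    sums, _ = np.histogram(chi, bins=edges, weights=vals)
    counts, _ = np.histogram(chi, bins=edges)

    # average within each bin; empty bins are left as zero
    U_chi = np.where(counts > 0, sums / np.maximum(counts, 1), 0.0)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, U_chi

# toy example: U(t, omega) = exp(-t*omega) depends on chi alone,
# so the collapsed curve should recover exp(-chi) and decrease with chi
t = np.linspace(0.0, 1.0, 64)
omega = np.linspace(0.0, 50.0, 64)
U = np.exp(-np.outer(t, omega))
chi_ax, U_chi = collapse_spectrum(U, t, omega)
print(chi_ax.shape, U_chi.shape)
```

With the toy spectrum above the collapsed 1D curve comes out monotonically decreasing, which at least matches the behaviour the paper describes, but I'd appreciate confirmation that binning/averaging over constant $$\chi$$ is really what is intended.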

I've tried to simply multiply each element in the rows of $$U(t, \omega)$$ with each element in the columns of $$U(t, \omega)$$ using two nested for loops, but a plot of the resulting 1D spectrum "jumps around."

The paper says that the 1D spectrum will decrease monotonically along the $$\chi$$ axis, and the figures in the paper clearly show this monotonic decrease.

Has anyone seen anything similar, or would someone be able to point me in the right direction?