Memory issues in numerical integration of oscillatory functions

AI Thread Summary
The discussion focuses on the challenges of numerically integrating a decaying complex function over an infinite interval, particularly when using Python libraries. Initial attempts with the quadrature method faced convergence issues, prompting a switch to the fast Fourier transform (FFT) for improved speed, which introduced memory constraints with large intervals and small step sizes. The proposed solution involves subdividing the integration interval, applying FFT to each subinterval, and summing the results to achieve convergence. Participants also discuss the importance of applying the scaling theorem for FFTs and addressing potential Gibbs oscillations by windowing segments. The conversation highlights the utility of periodicity in the integration process and explores alternative methods for handling oscillatory functions.
cyberpotato
TL;DR Summary
Memory challenges in numerically integrating oscillatory functions using FFT in Python. Starting this thread here as a related thread was discussed earlier: https://www.physicsforums.com/threads/numerical-integration-fourier-transform-or-brute-force.283916/#post-2030214
Hello!

I need to numerically integrate a rapidly oscillating, decaying, continuous complex function over the interval from 0 to infinity. For brevity, the integral has the general form
$$\int_{0}^{\infty} A(t)\,e^{e^{i\omega' t}}\,dt.$$
I'm using Python libraries for this task. Initially, I tried the quadrature method, but encountered convergence issues when integrating over a large interval. To address this, I started subdividing the integration interval into subintervals, integrating them separately, and summing the results until convergence. However, this process was time-consuming.
To speed up the integration, I switched to using the fast Fourier transform (FFT) from scipy.fft. While this approach improved integration speed, a new issue arose. When passing a vector of function values to scipy.fft, I faced memory constraints for very large integration intervals with small step sizes. To solve this, I considered subdividing the large integration interval, performing FFT on each subinterval, and summing the results until convergence.
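For what it's worth, here is a minimal sketch of a memory-bounded alternative to a single huge FFT call (a swapped-in technique, not the FFT pipeline itself): evaluate the integrand block by block and accumulate a composite trapezoid sum, so only one block of the grid is ever held in memory. The amplitude A(t) = e^{-0.1t}, the frequency ω' = 5, and the grid parameters are hypothetical placeholders, not the actual problem data.

```python
# Sketch: block-wise trapezoid accumulation to keep memory bounded.
# A(t), omega, t_max, dt and block_len are hypothetical placeholders.
import numpy as np

omega = 5.0                      # placeholder for omega'
A = lambda t: np.exp(-0.1 * t)   # placeholder decaying amplitude
f = lambda t: A(t) * np.exp(np.exp(1j * omega * t))

def trapezoid_blocks(f, t_max, dt, block_len=1_000_000):
    """Composite trapezoid rule over [0, t_max], accumulated in fixed-size blocks."""
    total = 0.0 + 0.0j
    n_total = int(round(t_max / dt))
    start = 0
    while start < n_total:
        stop = min(start + block_len, n_total)
        t = np.arange(start, stop + 1) * dt   # share one endpoint with the next block
        total += np.trapz(f(t), dx=dt)
        start = stop
    return total

print(trapezoid_blocks(f, t_max=200.0, dt=1e-4))
```

The design choice is simply never to materialize the full sample grid; each block reuses the previous block's last endpoint, so the composite rule stays exact across block boundaries.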

Is it correct to solve this issue by breaking down the large integration interval into subintervals, performing FFT on each subinterval, and summing until convergence? Additionally, are there alternative methods for integrating such functions that I should consider?
 
It may be worth setting
$$I_n = \int_{0}^{2\pi/\omega'} A\left(\frac{2n\pi}{\omega'} + t\right)e^{e^{i\omega' t}}\,dt, \qquad n \geq 0,$$
and seeing how fast |I_n| decays with n. Your integral can then be approximated as
$$\sum_{n=0}^{N} I_n$$
for some N.
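As a rough illustration of this per-period decomposition (with a hypothetical amplitude A(t) = e^{-0.1t} and ω' = 5 standing in for the real problem), one could compute a batch of I_n by quadrature and watch how quickly |I_n| falls off:

```python
# Sketch of the per-period decomposition: I_n integrates one period at a time.
# A(t) and omega are hypothetical placeholders.
import numpy as np
from scipy.integrate import quad

omega = 5.0
period = 2 * np.pi / omega
A = lambda t: np.exp(-0.1 * t)

def I_n(n):
    """I_n = integral over [0, 2*pi/omega] of A(2*n*pi/omega + t) * exp(exp(i*omega*t))."""
    g = lambda t: A(n * period + t) * np.exp(np.exp(1j * omega * t))
    re, _ = quad(lambda t: g(t).real, 0.0, period, limit=200)
    im, _ = quad(lambda t: g(t).imag, 0.0, period, limit=200)
    return re + 1j * im

terms = [I_n(n) for n in range(50)]
print([round(abs(I), 6) for I in terms[:5]])   # how fast does |I_n| decay?
print(sum(terms))                              # partial sum approximating the integral
```

Once |I_n| drops below a chosen tolerance, the partial sum can be truncated there.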
 
I think what you are asking is that you have
$$
\int_0^{\infty} A(t)\left[1 + e^{i\omega t}+ \frac{e^{2i\omega t}}{2!} +\frac{e^{3i\omega t}}{3!} + \cdots\right]dt
$$
and wish to take the FFT of your time series against each succeeding term in the expansion. Firstly, you must apply the scaling theorem for FFTs to each term (see Scaling Theorem), which will probably result in interpolation issues that you must resolve. Secondly, by breaking up your time series into segments and taking the FFT of each segment, Gibbs oscillations raise their ugly heads. Therefore you must window each segment using a Hamming window or whatnot.
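For the windowing point, a bare-bones illustration of tapering each segment with a Hamming window before its FFT might look like the following; the sample rate, segment length, and test signal are arbitrary placeholders.

```python
# Sketch: window each segment before the FFT to tame edge (Gibbs/leakage) artifacts.
# The signal, sample rate and segment length are placeholders.
import numpy as np

fs = 1000.0                                           # placeholder sample rate
t = np.arange(0, 10.0, 1.0 / fs)
x = np.exp(-0.1 * t) * np.cos(2 * np.pi * 5.0 * t)    # placeholder time series

seg_len = 2048
window = np.hamming(seg_len)
spectra = []
for start in range(0, len(x) - seg_len + 1, seg_len):
    segment = x[start:start + seg_len]
    spectra.append(np.fft.fft(segment * window))      # leftover tail samples are dropped
```

If absolute amplitudes matter, the window's coherent gain (window.sum() / seg_len) would also need to be compensated.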
 
pasmith said:
It may be worth setting
$$I_n = \int_{0}^{2\pi/\omega'} A\left(\frac{2n\pi}{\omega'} + t\right)e^{e^{i\omega' t}}\,dt, \qquad n \geq 0,$$
and seeing how fast |I_n| decays with n. Your integral can then be approximated as
$$\sum_{n=0}^{N} I_n$$
for some N.
Your approach looks very interesting, thank you. I'm trying to understand it. Are you using the periodicity property of the exp(iωt) function? In the case A(t) = 1, the integral will have the form:
$$\int_{0}^{\infty }e^{e^{i\omega t}}\,dt \approx \sum_{n=0}^{N} \int_{0}^{2\pi/\omega}e^{e^{i\omega t}}\,dt$$
Is that right?
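A quick numerical check of the periodicity step for A(t) = 1 (the value of ω below is an arbitrary choice): the integral over N periods should equal N times the integral over one period, since exp(exp(iωt)) has period 2π/ω.

```python
# Check: for A(t) = 1, integrating over N periods equals N times one period.
# omega is an arbitrary placeholder.
import numpy as np
from scipy.integrate import quad

omega = 5.0
T = 2 * np.pi / omega
f = lambda t: np.exp(np.exp(1j * omega * t))

def cquad(a, b):
    """Complex-valued quadrature via separate real and imaginary parts."""
    re, _ = quad(lambda t: f(t).real, a, b, limit=200)
    im, _ = quad(lambda t: f(t).imag, a, b, limit=200)
    return re + 1j * im

print(cquad(0.0, 3 * T))      # three periods
print(3 * cquad(0.0, T))      # three times one period; should agree
```

Note that with A(t) = 1 the integrand does not decay, so this only checks the periodicity identity, not convergence of the infinite integral.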
 