The Asymptotic Eigenvalues of a Circulant Matrix

Summary
The eigenvalues of a circulant matrix are given by the discrete Fourier transform of its first row, \lambda_n=\sum_l h_l\exp(-j(2\pi/N)nl). The question is whether, in an asymptotic sense as N approaches infinity, all eigenvalues become equal to \sum_l h_l. The reply argues no: the eigenvalues can all be equal only if the spectrum is flat, which forces the first row to be an impulse, so the matrix is (a scalar multiple of) the identity. If the eigenvalues instead vary slightly, the spectrum looks approximately "white", as for a row drawn from a Gaussian random variable. A counter-example is the all-ones matrix, which for every N has one non-zero eigenvalue and the rest zero, so the proposition is untrue in general.
EngWiPy
Hi,

The eigenvalues of a circulant matrix are given by:

\lambda_n=\sum_{l=0}^{L}h_l\exp\left(-j\frac{2\pi}{N}nl\right)

for n=0,1,\ldots,N-1. Is it legitimate to do the analysis in an asymptotic sense (as N approaches infinity), in which case

\lambda_0=\cdots=\lambda_{N-1}=\sum_{l=0}^{L}h_l?

Thanks
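
A quick numerical sketch of the question (assuming NumPy; the taps h = [1.0, 0.5, 0.25] are an arbitrary choice, not from the thread): for any fixed n, the exponent 2\pi nl/N tends to 0 as N grows, so \lambda_n \to \sum_l h_l, but the maximum deviation over all N eigenvalues does not shrink.

```python
import numpy as np

# Arbitrary fixed taps h_0..h_L (here L = 2); any finite sequence works.
h = np.array([1.0, 0.5, 0.25])
target = h.sum()  # the conjectured common limit, sum_l h_l

for N in [8, 64, 512, 4096]:
    first_row = np.zeros(N)
    first_row[:len(h)] = h
    # Eigenvalues of a circulant matrix = DFT of its first row:
    #   lambda_n = sum_l h_l * exp(-j * 2*pi*n*l / N)
    lam = np.fft.fft(first_row)
    print(f"N={N:5d}  lambda_0={lam[0].real:.3f}  "
          f"max|lambda_n - target|={np.abs(lam - target).max():.3f}")
```

Here \lambda_0 equals \sum_l h_l exactly for every N, but the eigenvalues with index near N/2 sit at roughly h_0 - h_1 + h_2, a fixed distance from the target, so the set never collapses to a single value.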
 
I'm not a mathematician, so my answer will lack rigor, but here goes:

I would think the answer is no. The eigenvalues are the discrete Fourier transform, i.e. the spectrum, of the top row (or first column) of the circulant matrix. For the eigenvalues to all be equal, the spectrum must be constant, which implies that the first row is an impulse: a one followed by all zeros. Note that this matrix is the identity matrix. If the eigenvalues are instead allowed to vary randomly by a little bit, then the spectrum looks approximately "white", as it does for Gaussian noise, and the row vector is a sequence drawn from a Gaussian random variable. Both of these are very special cases, of course, and there is no reason to believe that an arbitrary circulant matrix will resemble them. Hence I believe that your proposition is untrue in general.
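
A minimal sketch of the "constant spectrum implies identity" step (assuming NumPy): the first row is the inverse DFT of the eigenvalues, so prescribing a flat spectrum recovers an impulse row.

```python
import numpy as np

N = 16
flat_spectrum = np.ones(N)  # every eigenvalue equal to 1

# First row of the circulant matrix = inverse DFT of its eigenvalues.
first_row = np.fft.ifft(flat_spectrum)
print(np.round(first_row.real, 6))  # [1. 0. 0. ... 0.] -> the identity matrix
```

A flat spectrum with value c would instead give first row (c, 0, \ldots, 0), i.e. c times the identity.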

EDIT: Just thought of an obvious counter-example: a matrix of all 1's. No matter how large it gets, it has one non-zero eigenvalue (equal to N) and all the rest zero.
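
Checking the counter-example numerically (again a sketch, assuming NumPy): the DFT of an all-ones row is N at index 0 and zero elsewhere, so the eigenvalues certainly do not approach a common value as N grows.

```python
import numpy as np

for N in [4, 16, 256]:
    lam = np.fft.fft(np.ones(N))  # eigenvalues of the N x N all-ones circulant
    nonzero = lam[np.abs(lam) > 1e-9].real
    print(f"N={N:3d}: nonzero eigenvalues = {nonzero}, plus {N - len(nonzero)} zeros")
```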
 
