1. The problem statement, all variables and given/known data

x[n] = Σ c_k · δ[n−k], summed from k = −N to N. Plot the DTFT as a function of the number of terms N. This is a finite sum.

2. Relevant equations

The DTFT of a signal: X(φ) = Σ x[n] · e^(−j·2π·φ·n), summed from n = −∞ to +∞.

3. The attempt at a solution

I have nothing... something like

N = 50;
ck = 1;
X = [];                        % was "X = ;", a syntax error
phi = 1/(2*pi);
for n = -1000:1:1000
    s = 0;                     % renamed from "sum" to avoid shadowing MATLAB's built-in sum()
    for k = -N:1:N
        s = s + (n == k);      % Kronecker delta; MATLAB's dirac() is the continuous-time impulse (Inf at 0)
    end
    X = [X, s*exp(-1i*2*pi*phi*n)];  % evaluates the sum at only one fixed frequency phi
end
n = -1000:1:1000;
plot(n, X);

But this is obviously wrong. I don't know how to compute the DTFT of a finite-length signal like that, much less plot it using MATLAB!
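One way this could be set up (a sketch only, assuming all c_k = 1, which the original attempt's "ck = 1" suggests): since x[n] is zero outside |n| ≤ N, the infinite DTFT sum collapses to the finite sum X(ω) = Σ_{k=−N}^{N} e^(−jωk), which can be evaluated on a dense grid of frequencies ω instead of at a single fixed value. The grid variable w and the choice of N values below are not from the original post.

```
% Sketch, assuming c_k = 1 for all k: the DTFT of x[n] = sum_{k=-N}^{N} delta[n-k]
% reduces to X(w) = sum_{k=-N}^{N} exp(-j*w*k), evaluated over one period of w.
Ns = [5 20 50];                   % illustrative values of N to compare
w  = linspace(-pi, pi, 2001);     % frequency grid over one period (row vector)
figure; hold on;
for N = Ns
    k = (-N:N).';                 % column vector of nonzero sample indices
    X = sum(exp(-1i * k * w), 1); % (2N+1)-by-2001 matrix, summed over k
    plot(w, abs(X));              % magnitude; the peak at w = 0 equals 2N+1
end
xlabel('\omega (rad/sample)'); ylabel('|X(e^{j\omega})|');
legend('N = 5', 'N = 20', 'N = 50');
hold off;
```

With c_k = 1 this sum is the Dirichlet kernel, sin(ω(2N+1)/2)/sin(ω/2), so as N grows the main lobe at ω = 0 gets taller (height 2N+1) and narrower, which is the dependence on N the problem asks to observe.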