axmls

A constant signal is periodic with arbitrarily small period (equivalently, it has no fundamental period), so it seems to me that the frequency of a constant signal can be made to grow without bound. However, in Fourier analysis, for instance, we treat constant signals as having a frequency component only at ##f = 0##. Why, mathematically, can we not say that a constant signal has (approaching) infinite frequency, since ##f = 1/T## and a constant function has arbitrarily small ##T##? Certainly on an experimental basis (e.g., designing a high-pass filter to remove the constant component of a signal), ##f = 0## corresponds to a constant signal, but I'd like a mathematical reason for this.
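For what it's worth, here's a quick numerical sketch of what I mean about Fourier analysis (my own example, using NumPy's FFT): the DFT of a sampled constant signal puts all of its energy in the ##k = 0## (DC) bin, with essentially nothing anywhere else.

```python
import numpy as np

# Sample a constant signal and inspect its DFT: all the energy
# sits in the k = 0 (zero-frequency / DC) bin, none elsewhere.
N = 64
x = np.full(N, 3.0)  # constant signal x[n] = 3

X = np.fft.fft(x)

print(np.abs(X[0]))            # DC bin magnitude: N * 3 = 192
print(np.max(np.abs(X[1:])))   # all other bins are (numerically) zero
```

So the transform itself insists the constant "lives" at zero frequency, which is the behavior I'd like a mathematical justification for.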