1. The problem statement, all variables and given/known data

There seem to be varying answers on what the definition of "Nyquist frequency" is, between my professor, myself, and various internet sites. One group says:

"The Nyquist frequency is 1/2 the sampling rate."

The other group says:

"The Nyquist frequency is twice the signal's bandwidth."

Both groups agree that the sampling rate must be at least twice the signal's bandwidth to avoid aliasing, but when a problem simply says "the sampling rate is x and the signal's bandwidth is y; what is the Nyquist frequency?", I get different answers depending on which definition I use. Which definition is correct, and how can I argue this to my professor?

2. Relevant equations

3. The attempt at a solution

Sources, group 1:
http://en.wikipedia.org/wiki/Nyquist_frequency
http://www.encyclopedia.com/doc/1O13-Nyquistfrequency.html

Sources, group 2:
http://www2.egr.uh.edu/~glover/applets/Sampling/Sampling.html

Googling "Nyquist frequency definition" also turns up group 2's usage in the top results.
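To make the disagreement concrete, here is a minimal Python sketch (the values of `fs` and `B` are made up for illustration, not taken from any particular problem). It computes both candidate quantities side by side; note that the Wikipedia article cited above for group 1 calls fs/2 the Nyquist *frequency* and reserves the name Nyquist *rate* for 2·B. It also includes a quick numerical check that a tone above fs/2 really does alias:

```python
import math

# Hypothetical numbers for illustration (not from the problem).
fs = 8000.0  # sampling rate, Hz
B = 3000.0   # signal bandwidth, Hz

# Group 1's definition (Wikipedia): Nyquist frequency = half the sampling rate.
nyquist_frequency = fs / 2  # 4000.0 Hz
# The quantity group 2 describes; Wikipedia calls this the Nyquist *rate*.
nyquist_rate = 2 * B        # 6000.0 Hz

# Both groups state the same no-aliasing condition, just from two directions:
# sample at least at the Nyquist rate <=> keep the bandwidth at or below
# the Nyquist frequency.
assert (fs >= nyquist_rate) == (B <= nyquist_frequency)

# Aliasing check: at fs = 8 Hz, a 9 Hz tone yields exactly the same samples
# as a 1 Hz tone, because 9 Hz = 1 Hz + fs.
fs_demo = 8.0
samples_1hz = [math.sin(2 * math.pi * 1.0 * n / fs_demo) for n in range(8)]
samples_9hz = [math.sin(2 * math.pi * 9.0 * n / fs_demo) for n in range(8)]
max_diff = max(abs(a - b) for a, b in zip(samples_1hz, samples_9hz))
print(nyquist_frequency, nyquist_rate, max_diff)
```

With these numbers the two "definitions" give 4000 Hz and 6000 Hz respectively, which is exactly the discrepancy described in the question; the aliasing check shows why fs/2 is the frequency at which the sampled representation breaks down.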