There seem to be varying answers about the definition of "Nyquist Frequency" among my professor, myself, and various internet sites.
One group says:
"The Nyquist Frequency is 1/2 the sampling rate"
and the other group says:
"The Nyquist Frequency is twice the signal's bandwidth"
Both groups agree that your sampling rate must be at least twice the signal's bandwidth to avoid aliasing, but when a problem simply says "the sampling rate is x, and the signal's bandwidth is y, what is the Nyquist Frequency?" I get different answers depending on which definition I use.
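To make the discrepancy concrete, here is a quick sketch with made-up numbers (the 8 kHz sampling rate and 3 kHz bandwidth are hypothetical, chosen only so the two definitions disagree):

```python
# Hypothetical example values: sampling rate fs and signal bandwidth B, in Hz.
fs = 8000.0  # sampling rate (x in the problem statement)
B = 3000.0   # signal bandwidth (y in the problem statement)

# Group 1's definition: half the sampling rate.
nyquist_group1 = fs / 2

# Group 2's definition: twice the signal's bandwidth.
nyquist_group2 = 2 * B

print(nyquist_group1)  # 4000.0
print(nyquist_group2)  # 6000.0
```

With the same inputs, group 1's definition gives 4 kHz while group 2's gives 6 kHz, which is exactly the ambiguity described above. For what it's worth, many signal-processing texts reserve "Nyquist frequency" for fs/2 (a property of the sampling system) and call 2B the "Nyquist rate" (a property of the signal), which would explain why both formulas appear in sources under similar-sounding names.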
Which is the correct definition, and how can I argue this to my professor?
The Attempt at a Solution
Sources, group 1:
Sources, group 2:
Googling "Nyquist Frequency Definition" returns results that agree with group 2.