1. The problem statement, all variables and given/known data

The signal is s = cos(12*pi*t) and the time vector runs from 0 to 10 s. One vector has increments of 0.1 s and the other has increments of 0.01 s. What is the plotted frequency for each time scale, and why does it change when the increment changes?

2. Relevant equations

f = 1/T

3. The attempt at a solution

We're supposed to read the frequencies off the plots. For the 0.1 s increments, the frequency appears to be 1 Hz (T ≈ 1 s, so f = 1/1 = 1 Hz). For the 0.01 s increments, the frequency appears to be 10 Hz (T ≈ 0.1 s, so f = 1/0.1 = 10 Hz). She wants us to explain why the plots differ and I don't get it, although I think it has to do with frequency offset. Can someone explain? Hope you can help. Thanks!
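In case it helps anyone reproduce this, here's a quick numerical check (a Python/NumPy sketch rather than MATLAB; the helper name `dominant_freq` is just mine). It samples the same cosine at both step sizes and picks the largest FFT peak. The true tone is 6 Hz (T = 2π/12π = 1/6 s), which is above the 5 Hz Nyquist limit of the 0.1 s grid (fs = 10 Hz), so on the coarse grid it should fold down to |6 − 10| = 4 Hz:

```python
import numpy as np

def dominant_freq(dt, T=10.0):
    """Sample s(t) = cos(12*pi*t) with step dt over [0, T) and
    return the frequency (Hz) of the largest FFT peak."""
    t = np.arange(0.0, T, dt)
    s = np.cos(12 * np.pi * t)
    spectrum = np.abs(np.fft.rfft(s))
    freqs = np.fft.rfftfreq(len(t), d=dt)
    # Skip the DC bin when locating the peak
    return freqs[1:][np.argmax(spectrum[1:])]

print(dominant_freq(0.01))  # fine grid (fs = 100 Hz): recovers the true 6 Hz tone
print(dominant_freq(0.1))   # coarse grid (fs = 10 Hz): 6 Hz aliases to 4 Hz
```

The point is that the plotted wave's apparent frequency depends on the sample spacing whenever the sampling rate is below twice the signal frequency.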